Machine Learning Model for Human Activity Analysis
Human Activity Recognition (HAR) is an active area of research in which numerous machine learning models, built on a variety of methodologies, have been proposed for identifying and categorizing human activities. The aim of HAR is to analyze activities captured in video or still images, and HAR systems seek to classify input data correctly into its underlying activity category. Depending on their complexity, human activities are classified as (a) gestures, (b) atomic actions, (c) human-to-object or human-to-human interactions, (d) collective actions, (e) behaviors, and (f) events. Health informatics has become a critical field for improving healthcare efficiency by streamlining the collection, storage, and retrieval of essential patient health data. In this paper, an intelligent smart healthcare system is presented that uses machine learning approaches to deliver ubiquitous human activity recognition in an automated manner. The goal is to model and recognize activities of daily living precisely and efficiently. For HAR purposes, we focus on a dataset comprising body motion and vital sign recordings collected from volunteers of various profiles while they performed different physical activities. This research demonstrates that identifying human activity from sensor data remains difficult even with the wide range of machine learning approaches available; when it comes to machine learning techniques, there is no one-size-fits-all solution.
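To make the sensor-based HAR setup concrete, the sketch below shows one common pipeline: fixed-size sliding windows over tri-axial accelerometer data, simple per-window statistical features, and a conventional classifier. The window length, step size, class count, and synthetic data are illustrative assumptions for this sketch, not the dataset or system evaluated in this paper.

```python
# A minimal HAR pipeline sketch, assuming tri-axial accelerometer samples
# and per-sample activity labels are available as NumPy arrays.
# Window length, step, and class count below are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report


def segment(signal, labels, window=128, step=64):
    """Slice an (N, 3) accelerometer stream into fixed-size windows.

    Each window is assigned the majority label of its samples.
    """
    X, y = [], []
    for start in range(0, len(signal) - window + 1, step):
        win = signal[start:start + window]
        lab = labels[start:start + window]
        X.append(win)
        y.append(np.bincount(lab).argmax())  # majority vote per window
    return np.array(X), np.array(y)


def extract_features(windows):
    """Compute simple per-axis statistics for each window."""
    feats = [windows.mean(axis=1),   # mean per axis
             windows.std(axis=1),    # standard deviation per axis
             windows.min(axis=1),    # minimum per axis
             windows.max(axis=1)]    # maximum per axis
    return np.concatenate(feats, axis=1)


if __name__ == "__main__":
    # Synthetic stand-in for real recordings: 10,000 samples, 3 axes,
    # 4 hypothetical activity classes (e.g., sitting, standing, walking, running).
    rng = np.random.default_rng(0)
    signal = rng.normal(size=(10_000, 3))
    labels = rng.integers(0, 4, size=10_000)

    X_win, y_win = segment(signal, labels)
    X = extract_features(X_win)

    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y_win, test_size=0.3, stratify=y_win, random_state=0)

    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    clf.fit(X_tr, y_tr)
    print(classification_report(y_te, clf.predict(X_te)))
```

In practice, changing the window size, the feature set, or the classifier can shift recognition accuracy considerably, which reflects the observation above that no single machine learning technique fits all HAR settings.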