Y.J.Kee, M.N.Shah Zainudin



Recognizing the Activity Daily Living (ADL) of Subject Independent



Recognition of the Activity Daily Living (ADL) has recently garnered attention for providing valuable information about humans. Small and easy to carry, wearable sensors such as accelerometers have opened space for researchers to explore prior knowledge in pervasive computing, and have begun to gain attention among researchers working across the broad area of human activity recognition. Recent ADL research tackles not only simple activities but also a broad range of complex activities. However, recognition accuracy tends to decrease when a large number of subjects is involved: even when the same activity is performed by different subjects, the acceleration signals obtained differ significantly, because each subject's movement pattern depends on aspects such as age, gender, emotion and personality. This paper therefore proposes a framework that tackles the subject-independence problem by improving the recognition accuracy of ADL. The signal obtained from the accelerometer sensor undergoes a segmentation process to extract additional valuable features. In certain cases, some of these features may be irrelevant for determining the class; hence, we propose feature selection to retain the most meaningful features, which can lead to accuracy above 90%. In addition, this paper presents a short empirical review of previous related work. This preliminary work is evaluated and analyzed using several machine learning algorithms.
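The pipeline the abstract describes (windowed segmentation of accelerometer signals, feature extraction, feature selection, then classification) can be sketched as follows. This is an illustrative sketch, not the authors' implementation: the window length, the statistical features, the `SelectKBest` selector, and the random-forest classifier are all assumptions, and synthetic data stands in for the WISDM accelerometer recordings.

```python
# Hypothetical sketch of a subject-independent ADL pipeline:
# segment -> extract features -> select features -> classify.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline

rng = np.random.default_rng(0)

def extract_features(window):
    """Per-axis mean, std, min, max for a (samples, 3) accelerometer window."""
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           window.min(axis=0), window.max(axis=0)])

# Synthetic stand-in data: 200 windows of 128 samples x 3 axes, 2 activities.
X_raw = rng.normal(size=(200, 128, 3))
y = rng.integers(0, 2, size=200)
X_raw[y == 1] += 0.5  # give the second activity a distinguishable offset

# Segment-level feature matrix: one 12-dimensional vector per window.
X = np.array([extract_features(w) for w in X_raw])

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

# Keep only the most discriminative features before classifying,
# mirroring the feature-selection step proposed in the paper.
clf = Pipeline([
    ("select", SelectKBest(f_classif, k=6)),
    ("forest", RandomForestClassifier(n_estimators=100, random_state=0)),
])
clf.fit(X_train, y_train)
acc = clf.score(X_test, y_test)
```

In a real experiment, `X_raw` would be replaced by fixed-length windows cut from the WISDM recordings, and a subject-independent evaluation would hold out entire subjects (e.g. grouped cross-validation) rather than splitting windows at random.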


Activity Daily Living (ADL), accelerometer, wearable sensor, machine learning, WISDM, subject independent.


Cite this paper

Y.J.Kee, M.N.Shah Zainudin. (2019) Recognizing the Activity Daily Living (ADL) of Subject Independent. International Journal of Control Systems and Robotics, 4, 101-108


Copyright © 2019 Author(s) retain the copyright of this article.
This article is published under the terms of the Creative Commons Attribution License 4.0