Convolutional Neural Networks for Human Activity Recognition using Mobile Sensors
Proc. 6th International Conference on Mobile Computing, Applications and Services (MobiCASE-16) (2014)
A variety of real-life mobile sensing applications are becoming available, especially in the life-logging, fitness tracking, and health monitoring domains. These applications use mobile sensors embedded in smartphones to recognize human activities in order to better understand human behavior. While progress has been made, human activity recognition remains a challenging task. This is partly due to the broad range of human activities as well as the rich variation in how a given activity can be performed. Using features that clearly separate activities is crucial. In this paper, we propose an approach to automatically extract discriminative features for activity recognition. Specifically, we develop a method based on Convolutional Neural Networks (CNN), which can capture local dependency and scale invariance of a signal, as has been shown in the speech recognition and image recognition domains. In addition, a modified weight sharing technique, called partial weight sharing, is proposed and applied to accelerometer signals to obtain further improvements. The experimental results on three public datasets, Skoda (assembly line activities), Opportunity (kitchen activities), and Actitracker (jogging, walking, etc.), indicate that our novel CNN-based approach is practical and achieves higher accuracy than existing state-of-the-art methods.
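The core mechanism the abstract names, a convolution whose kernel weights are shared across time steps so that local motion patterns are detected regardless of where they occur in the window, can be sketched in a few lines of NumPy. This is an illustrative sketch of standard (fully shared) 1D convolution over a tri-axial accelerometer window, not the authors' code; the function and variable names here are our own, and the partial weight sharing variant in the paper restricts sharing to nearby positions rather than the whole window.

```python
import numpy as np

def conv1d(signal, kernels, stride=1):
    """Valid 1D convolution over a (channels, time) sensor window.

    Every kernel slides over the time axis with the same weights
    (full weight sharing), which is what lets a CNN pick up local,
    shift-invariant motion patterns in accelerometer data.
    """
    c, t = signal.shape
    k, kc, kw = kernels.shape  # (num_kernels, channels, kernel_width)
    assert kc == c, "kernel channels must match signal channels"
    out_t = (t - kw) // stride + 1
    out = np.empty((k, out_t))
    for i in range(out_t):
        window = signal[:, i * stride : i * stride + kw]
        # Dot each kernel against the current local window.
        out[:, i] = np.tensordot(kernels, window, axes=([1, 2], [0, 1]))
    return out

# Toy tri-axial accelerometer window: 3 channels x 16 samples.
rng = np.random.default_rng(0)
x = rng.standard_normal((3, 16))
w = rng.standard_normal((4, 3, 5))      # 4 feature maps, width-5 kernels
features = np.maximum(conv1d(x, w), 0)  # ReLU-activated feature maps
print(features.shape)                   # (4, 12): 16 - 5 + 1 time positions
```

In a full model these feature maps would be pooled and fed to a classifier; the point of the sketch is only that the same small kernel scans the whole window, so a learned pattern (e.g. a step impact) is detected wherever it appears.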
Keywords
- Deep Learning
- Convolutional Neural Network
- Human Activity Recognition
Publication Date: November 7, 2014
Citation Information: Ming Zeng, Le T. Nguyen, Bo Yu, Ole J. Mengshoel, et al. "Convolutional Neural Networks for Human Activity Recognition using Mobile Sensors." Proc. 6th International Conference on Mobile Computing, Applications and Services (MobiCASE-16) (2014), pp. 197-205.
Available at: http://works.bepress.com/ole_mengshoel/59/