Multi-Sensor Information Fusion and Machine Learning for High Accuracy Rate of Mechanical Pedometer in Human Activity Recognition

Authors: Adjeisah, M., Liu, G., Nyabuga, D.O. and Nortey, R.N.

Conference: 2019 IEEE Intl Conf on Parallel & Distributed Processing with Applications, Big Data & Cloud Computing, Sustainable Computing & Communications, Social Computing & Networking (ISPA/BDCloud/SocialCom/SustainCom)

Dates: 16-18 December 2019

Abstract:

Many Human Activity Recognition (HAR) systems have been engineered; however, only a few studies combine the Kinect and the smartphone. Most work also targets wearable sensors, which can cause wearer discomfort, particularly in continuous-monitoring applications where the sensor must be worn for long periods. Additionally, while these approaches are promising, their accuracy tends to fluctuate over time. To address these concerns, this paper reports the fusion of sensor data to achieve robust accuracy. The system uses the Microsoft Kinect Sensor (MKS) to detect angle changes in the hip-knee-ankle joints and a smartphone's gravitational accelerometer for step tracking while engaging users. Three methods are compared: 1) manually counted steps, 2) raw data from our algorithm, and 3) a parameterized k-Nearest Neighbors (kNN) classifier applied to improve the step counts. A best-first search over the maximum neighbor distance of the kNN algorithm is employed. Varying k, an accuracy of 96.44% is achieved with σ = 0.040.
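The abstract's kNN step with varying k can be illustrated with a minimal sketch. The feature values, labels, and the set of k values swept here are all hypothetical stand-ins (the paper's actual fused features and search procedure are not reproduced); the sketch only shows the general pattern of majority-vote kNN plus a sweep over k:

```python
import math
import random
from collections import Counter

def knn_predict(train, query, k):
    """Classify `query` by majority vote among its k nearest training points."""
    neighbors = sorted(train, key=lambda p: math.dist(p[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Hypothetical synthetic data: 2-D features standing in for a fused
# (joint-angle change, accelerometer magnitude) window.
random.seed(0)
step = [((random.gauss(1.0, 0.2), random.gauss(1.0, 0.2)), "step") for _ in range(40)]
idle = [((random.gauss(0.0, 0.2), random.gauss(0.0, 0.2)), "idle") for _ in range(40)]
data = step + idle
random.shuffle(data)
train, test = data[:60], data[60:]

# Vary k and keep the best accuracy, mirroring the paper's parameter sweep in spirit.
best_k, best_acc = None, 0.0
for k in (1, 3, 5, 7):
    acc = sum(knn_predict(train, x, k) == y for x, y in test) / len(test)
    if acc > best_acc:
        best_k, best_acc = k, acc
print(best_k, round(best_acc, 2))
```

On well-separated synthetic clusters like these, even small k classifies nearly perfectly; the paper's reported 96.44% refers to its real sensor data, not this toy setup.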

Source: Manual