MhaGNN: A Novel Framework for Wearable Sensor-Based Human Activity Recognition Combining Multi-Head Attention and Graph Neural Networks

Authors: Wang, Y., Wang, X., Yang, H., Geng, Y., Yu, H., Zheng, G. and Liao, L.

Journal: IEEE Transactions on Instrumentation and Measurement

Volume: 72

eISSN: 1557-9662

ISSN: 0018-9456

DOI: 10.1109/TIM.2023.3276004

Abstract:

Obtaining robust feature representations from multi-position wearable sensory data is challenging in human activity recognition (HAR), since data from different positions can have unordered implicit correlations. Graph neural networks (GNNs) represent data as structured graphs, mining complex relationships and interdependencies via message passing between graph nodes. This article proposes a novel framework (MhaGNN) that combines GNNs and the multi-head attention (MHA) mechanism, aiming to learn more informative representations for multi-position HAR tasks. The MhaGNN framework takes the sensor channels from multiple wearing positions as nodes to construct graph-structured data along the spatial dimension. In addition, the MHA mechanism is introduced to perform the message passing and aggregation over the graphs for spatial-temporal feature extraction. MhaGNN learns correlations among sensor channels that serve as compensatory features, combined with the features captured from each individual sensor channel, to enhance HAR. Experimental evaluations on three publicly available HAR datasets (PAMAP2, OPPORTUNITY, and MHEALTH) and a ground-truth dataset (MPWHAR) demonstrate that the proposed MhaGNN achieves state-of-the-art recognition performance with the captured rich features.
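The core idea described above, treating sensor channels as graph nodes and using multi-head attention as the message-passing and aggregation step, can be sketched roughly as follows. This is not the authors' implementation; all shapes and parameter names (n_channels, d_model, n_heads) are illustrative assumptions, and the projections are random stand-ins for learned weights.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def mha_message_passing(nodes, Wq, Wk, Wv, n_heads):
    """One MHA-style message-passing step over a fully connected graph.

    nodes: (n_channels, d_model) per-channel features from one time window.
    Each attention head computes channel-to-channel weights and aggregates
    messages; concatenated heads give the updated node features.
    """
    n, d = nodes.shape
    dh = d // n_heads
    # Project to queries/keys/values and split into heads: (h, n, dh).
    q = (nodes @ Wq).reshape(n, n_heads, dh).transpose(1, 0, 2)
    k = (nodes @ Wk).reshape(n, n_heads, dh).transpose(1, 0, 2)
    v = (nodes @ Wv).reshape(n, n_heads, dh).transpose(1, 0, 2)
    # Scaled dot-product attention: (h, n, n) inter-channel correlation weights.
    attn = softmax(q @ k.transpose(0, 2, 1) / np.sqrt(dh))
    out = attn @ v                                   # aggregate messages per head
    return out.transpose(1, 0, 2).reshape(n, d)      # concat heads -> (n, d_model)

rng = np.random.default_rng(0)
n_channels, d_model, n_heads = 9, 16, 4              # e.g. 3 IMUs x 3 axes (assumed)
x = rng.standard_normal((n_channels, d_model))
Wq, Wk, Wv = (rng.standard_normal((d_model, d_model)) * 0.1 for _ in range(3))
y = mha_message_passing(x, Wq, Wk, Wv, n_heads)
print(y.shape)  # (9, 16)
```

In the full framework these attended inter-channel features would act as the "compensatory" representations mentioned in the abstract, concatenated with per-channel features before classification.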

Source: Scopus