Real-time motion data annotation via action string
Authors: Qi, T., Xiao, J., Zhuang, Y., Zhang, H., Yang, X., Zhang, J.J. and Feng, Y.
Journal: Computer Animation and Virtual Worlds
Volume: 25
Pages: 293-302
ISSN: 1546-4261
Abstract: Despite the explosive growth of motion capture data, there is still a lack of efficient and reliable methods to automatically annotate all the motions in a database. Moreover, because of the popularity of mocap devices in home entertainment systems, real-time human motion annotation or recognition is becoming increasingly important. This paper presents a new motion annotation method that achieves both of these targets at the same time. It uses a probabilistic pose feature based on a Gaussian mixture model to represent each pose. After training a clustered pose feature model, a motion clip can be represented as an action string. A dynamic programming-based string matching method is then introduced to compare the differences between action strings. Finally, to meet the real-time target, we construct a hierarchical action string structure to quickly label each given action string. The experimental results demonstrate the efficacy and efficiency of our method.
https://eprints.bournemouth.ac.uk/21399/
http://dx.doi.org/10.1002/cav.1590
Source: BURO EPrints
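
The abstract outlines a pipeline of Gaussian-mixture pose clustering, action-string conversion, and dynamic programming string matching. The sketch below illustrates that general idea only; the pose features, GMM training, and Levenshtein-style matching here are generic stand-ins for the paper's specific formulation, not the authors' implementation.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def poses_to_action_string(pose_features, gmm):
    """Map each frame's pose feature vector to its most likely GMM component,
    yielding a discrete 'action string' (one symbol per frame) for the clip."""
    return gmm.predict(pose_features)

def edit_distance(a, b):
    """Dynamic programming string matching: Levenshtein distance between
    two action strings (sequences of cluster labels)."""
    m, n = len(a), len(b)
    dp = np.zeros((m + 1, n + 1), dtype=int)
    dp[:, 0] = np.arange(m + 1)
    dp[0, :] = np.arange(n + 1)
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i, j] = min(dp[i - 1, j] + 1,        # deletion
                           dp[i, j - 1] + 1,        # insertion
                           dp[i - 1, j - 1] + cost) # substitution / match
    return dp[m, n]

# Toy usage: random vectors stand in for real mocap pose descriptors.
rng = np.random.default_rng(0)
train_poses = rng.normal(size=(500, 30))            # 500 frames, 30-D features
gmm = GaussianMixture(n_components=8, random_state=0).fit(train_poses)

clip_a = poses_to_action_string(rng.normal(size=(60, 30)), gmm)
clip_b = poses_to_action_string(rng.normal(size=(75, 30)), gmm)
print("action-string distance:", edit_distance(clip_a, clip_b))
```

In such a scheme, annotating a query clip would amount to converting it to an action string and returning the label of the nearest stored string; the paper's hierarchical action string structure is described as a way to accelerate that lookup for real-time use.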