Adaptive multi-view feature selection for human motion retrieval

Authors: Wang, Z., Feng, Y., Qi, T., Yang, X. and Zhang, J.J.

Journal: Signal Processing

Volume: 120

Pages: 691-701

ISSN: 0165-1684

eISSN: 1872-7557

DOI: 10.1016/j.sigpro.2014.11.015

Abstract:

Human motion retrieval plays an important role in many applications based on motion data. In the past, many researchers tended to use a single type of visual feature as the data representation. Because different visual features describe different aspects of motion data and have dissimilar discriminative power with respect to a particular class of human motion, relying on a single feature can lead to poor retrieval performance. It is therefore beneficial to combine multiple visual features for motion data representation. In this article, we present an Adaptive Multi-view Feature Selection (AMFS) method for human motion retrieval. Specifically, we first use a local linear regression model to automatically learn multiple view-based Laplacian graphs that preserve the local geometric structure of the motion data. These graphs are then combined with a non-negative view-weight vector to exploit the complementary information between different features. Finally, to discard redundant and irrelevant feature components from the original high-dimensional feature representation, we formulate the objective function of AMFS as a general trace ratio optimization problem and design an effective algorithm to solve it. Extensive experiments on two public human motion databases, HDM05 and MSR Action3D, demonstrate the effectiveness of the proposed AMFS over state-of-the-art methods for motion data retrieval. Its scalability to large motion datasets and its insensitivity to the algorithm parameters make our method well suited to real-world applications.
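
The abstract outlines three steps: building view-specific Laplacian graphs, combining them with non-negative view weights, and selecting features via a trace ratio criterion. The sketch below illustrates that general idea only; it is not the authors' AMFS implementation. It uses a simple heat-kernel k-NN graph instead of the local linear regression model, fixed rather than learned view weights, and a per-feature Laplacian-score-style ratio as a stand-in for the joint trace ratio objective. All function names (build_knn_laplacian, combined_laplacian, trace_ratio_feature_scores) are hypothetical.

# Minimal sketch (assumptions noted above), not the authors' AMFS method.
import numpy as np

def build_knn_laplacian(X, k=5):
    """Unnormalised graph Laplacian from a k-NN similarity graph (heat kernel)."""
    n = X.shape[0]
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)  # pairwise squared distances
    sigma = np.median(d2[d2 > 0]) + 1e-12
    W = np.exp(-d2 / sigma)
    np.fill_diagonal(W, 0.0)
    # keep only the k strongest neighbours per row, then symmetrise
    drop = np.argsort(-W, axis=1)[:, k:]
    for i in range(n):
        W[i, drop[i]] = 0.0
    W = np.maximum(W, W.T)
    return np.diag(W.sum(axis=1)) - W

def combined_laplacian(views, weights, k=5):
    """Weighted sum of view-specific Laplacians; weights are non-negative and sum to 1."""
    weights = np.clip(np.asarray(weights, dtype=float), 0.0, None)
    weights /= weights.sum()
    return sum(w * build_knn_laplacian(X, k) for w, X in zip(weights, views))

def trace_ratio_feature_scores(X, L):
    """Score each feature column by its graph smoothness x^T L x relative to its
    total variance; lower scores indicate features that respect the graph structure."""
    Xc = X - X.mean(axis=0)
    smooth = np.einsum('ij,jk,ki->i', Xc.T, L, Xc)  # per-feature x^T L x
    total = (Xc ** 2).sum(axis=0) + 1e-12           # per-feature variance
    return smooth / total

# Toy usage: two "views" (feature types) describing the same 40 motion clips.
rng = np.random.default_rng(0)
view1, view2 = rng.normal(size=(40, 12)), rng.normal(size=(40, 8))
L = combined_laplacian([view1, view2], weights=[0.6, 0.4])
scores = trace_ratio_feature_scores(np.hstack([view1, view2]), L)
selected = np.argsort(scores)[:10]  # keep the 10 best-scoring feature components
print(selected)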

https://eprints.bournemouth.ac.uk/21812/

Sources: Scopus, Web of Science (Lite) and BURO EPrints

Preferred by: Jian Jun Zhang and Xiaosong Yang
