Adaptive motion synthesis for virtual characters: a survey

Authors: Guo, S., Southern, R., Chang, J., Greer, D. and Zhang, J.J.

Journal: Visual Computer

Volume: 31

Issue: 5

Pages: 497-512

ISSN: 0178-2789

DOI: 10.1007/s00371-014-0943-4

© 2014, Springer-Verlag Berlin Heidelberg. Character motion synthesis is the process of artificially generating natural motion for a virtual character. In film, motion synthesis can be used to generate difficult or dangerous stunts without putting performers at risk. In computer games and virtual reality, it enriches the player or participant experience by allowing for unscripted and emergent character behavior. In each of these applications, the ability to adapt smoothly and naturally to changes in environmental conditions or to the character itself, while still conforming to user-specified constraints, determines a method's utility to animators and industry practitioners. This focus on adaptation capability distinguishes our survey from other reviews, which focus on general technology developments. Three main methodologies (example-based, simulation-based, and hybrid) are summarised and evaluated using compound metrics: adaptivity, naturalness, and controllability. By assessing existing techniques according to this classification, we are able to determine how well a method corresponds to users' expectations. We discuss optimization strategies commonly used in the motion synthesis literature, as well as contemporary perspectives from biology that give deeper insight into this problem. We also present observations and reflections from industry practitioners to reveal the operational constraints of character motion synthesis techniques. Our discussion and review present a unique insight into the subject and provide essential guidance when selecting appropriate methods to design an adaptive motion controller.

The data on this page was last updated at 04:48 on February 24, 2018.