Integrating Live Skeleton data into a VR Environment
Authors: Stephenson, D. and Hoxey, T.
The aim of this project is to visualise live skeleton tracking data in a virtual analogue of a real-world environment, to be viewed in VR. Motion tracking with a single RGBD camera is a cost-effective way to obtain real-time 3D skeleton tracking data, and the people being tracked need no special markers, which makes it much more practical outside a studio or lab environment. However, the skeleton it provides is less accurate than that of a traditional multi-camera system. With a single fixed viewpoint the body can easily occlude itself, for example by standing side-on to the camera. Furthermore, without marked tracking points there can be inconsistencies in where the joints are identified, leading to inconsistent body proportions. In this paper we outline a method for improving the quality of motion capture data in real time, providing an off-the-shelf framework for importing the data into a virtual scene. Our method uses a two-stage approach to smooth smaller inconsistencies and to estimate the position of improperly proportioned or occluded joints.
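The kind of two-stage cleanup described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the blend factor `alpha`, the function names, and the bone-length correction are all assumptions about one plausible way to smooth jitter and counter inconsistent proportions from markerless tracking.

```python
import numpy as np

def smooth_joints(prev_smoothed, raw_joints, alpha=0.3):
    """Stage 1 (sketch): exponentially smooth per-joint 3D positions.

    prev_smoothed, raw_joints: (N, 3) arrays of joint positions in metres.
    alpha: blend factor; lower values give smoother motion but more latency.
    """
    return alpha * raw_joints + (1.0 - alpha) * prev_smoothed

def enforce_bone_length(parent, child, rest_length):
    """Stage 2 (sketch): re-project a child joint along the bone direction
    so the bone keeps a fixed rest length, correcting the inconsistent
    body proportions a markerless tracker can produce."""
    direction = child - parent
    norm = np.linalg.norm(direction)
    if norm < 1e-8:  # degenerate case: parent and child coincide
        return child
    return parent + direction * (rest_length / norm)
```

In a real pipeline the rest lengths would be calibrated per user (e.g. averaged over the first few frames), and occluded joints would additionally need a confidence signal from the tracker before being re-estimated.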