Real-Time Calibration and Registration Method for Indoor Scene with Joint Depth and Color Camera

Authors: Zhang, F., Lei, T., Li, J., Cai, X., Shao, X., Chang, J. and Tian, F.

Journal: International Journal of Pattern Recognition and Artificial Intelligence

ISSN: 0218-0014

DOI: 10.1142/S0218001418540216

© 2018 World Scientific Publishing Company

Abstract: Traditional vision-based registration technologies require precisely designed markers or rich texture information in the captured video scenes and have high computational complexity, while hardware-based registration technologies lack accuracy. In this paper, we therefore propose a novel registration method that takes advantage of an RGB-D camera to obtain depth information in real time: a binocular system combining a Time-of-Flight (ToF) camera and a commercial color camera is constructed to realize three-dimensional registration. First, we calibrate the binocular system to obtain the positional relationship between the two cameras, and the systematic depth errors are fitted and corrected with B-spline curves. To reduce anomalies and random noise, an outlier-elimination algorithm and an improved bilateral filtering algorithm are proposed to optimize the depth map; to meet the system's real-time requirement, these steps are further accelerated by parallel computing with CUDA. A Camshift-based tracking algorithm is then applied to capture the real object to be registered in the video stream, and the position and orientation of the object are tracked through the correspondence between the color image and the 3D data. Finally, experiments conducted on our binocular system demonstrate the feasibility and effectiveness of the method.
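The registration step the abstract describes — mapping a ToF depth pixel into the color image via the calibrated extrinsics [R|t] — can be sketched as follows. This is a generic pinhole-camera reprojection, not the paper's implementation; the intrinsic matrices `K_d`, `K_c` and the extrinsics are placeholders for whatever the calibration produces.

```python
import numpy as np

def register_depth_to_color(u, v, z, K_d, R, t, K_c):
    """Map a ToF depth pixel (u, v) with depth z into color-image
    coordinates, given the binocular calibration [R | t].

    K_d, K_c: 3x3 intrinsic matrices of the depth and color cameras.
    R, t: rotation and translation from the depth frame to the color frame.
    """
    # Back-project the depth pixel to a 3D point in the ToF camera frame.
    p_d = z * np.linalg.inv(K_d) @ np.array([u, v, 1.0])
    # Transform the point into the color-camera frame.
    p_c = R @ p_d + t
    # Project with the color camera's intrinsics and dehomogenize.
    uvw = K_c @ p_c
    return uvw[0] / uvw[2], uvw[1] / uvw[2]
```

With identical intrinsics and identity extrinsics the mapping is the identity; a nonzero baseline in `t` shifts the projection by the expected disparity.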
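For the depth-map optimization step, a classic bilateral filter smooths random noise while the range kernel keeps depth discontinuities sharp. The sketch below is the standard formulation, not the paper's improved variant (which, per the abstract, is also CUDA-accelerated); window radius and the two sigmas are illustrative values.

```python
import numpy as np

def bilateral_filter(depth, radius=2, sigma_s=2.0, sigma_r=25.0):
    """Edge-preserving smoothing of a single-channel depth map.

    sigma_s controls the spatial Gaussian, sigma_r the range Gaussian;
    neighbours whose depth differs strongly from the centre pixel are
    down-weighted, so object boundaries are preserved.
    """
    h, w = depth.shape
    pad = np.pad(depth.astype(np.float64), radius, mode="edge")
    out = np.zeros((h, w))
    # Precompute the spatial Gaussian over the filter window.
    ax = np.arange(-radius, radius + 1)
    xx, yy = np.meshgrid(ax, ax)
    spatial = np.exp(-(xx**2 + yy**2) / (2 * sigma_s**2))
    for i in range(h):
        for j in range(w):
            win = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            # Range kernel: penalize large depth differences.
            rng = np.exp(-((win - depth[i, j]) ** 2) / (2 * sigma_r**2))
            wgt = spatial * rng
            out[i, j] = (wgt * win).sum() / wgt.sum()
    return out
```

The doubly nested loop is exactly the part that maps naturally onto a CUDA kernel, one thread per output pixel, which is presumably where the paper's speedup comes from.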
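The core of Camshift-style tracking is a mean-shift iteration: the search window repeatedly moves to the centroid of a back-projected probability map until it stops shifting. The sketch below shows only that core; OpenCV's full CamShift additionally adapts the window size and orientation from image moments, and all parameters here are illustrative.

```python
import numpy as np

def mean_shift(prob, window, n_iter=30, eps=0.5):
    """Shift a search window to the centroid of a probability map.

    prob: HxW array of non-negative weights (e.g. a histogram
          back-projection of the tracked object's color model).
    window: (x, y, w, h) with (x, y) the top-left corner.
    """
    x, y, w, h = window
    for _ in range(n_iter):
        # Clamp the integer window position to the image bounds.
        xi = max(0, min(int(round(x)), prob.shape[1] - w))
        yi = max(0, min(int(round(y)), prob.shape[0] - h))
        patch = prob[yi:yi + h, xi:xi + w]
        m00 = patch.sum()
        if m00 == 0:
            break  # no probability mass under the window
        ys, xs = np.mgrid[0:h, 0:w]
        cx = (xs * patch).sum() / m00  # centroid in window coordinates
        cy = (ys * patch).sum() / m00
        # Re-center the window on the centroid.
        nx = xi + cx - (w - 1) / 2
        ny = yi + cy - (h - 1) / 2
        converged = abs(nx - x) < eps and abs(ny - y) < eps
        x, y = nx, ny
        if converged:
            break
    return x, y, w, h
```

Started near a probability blob, the window's center converges onto the blob's mode within a few iterations.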
