Clothes size prediction from dressed-human silhouettes
This data was imported from Scopus:
Journal: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 10582 LNCS
© 2017, Springer International Publishing AG. We propose an effective and efficient way to automatically predict clothes size for users buying clothes online. We take human height and dressed-human silhouettes in front and side views as input, and estimate 3D body sizes with a data-driven method. We adopt 20 body sizes that are closely related to clothes size, and use these 3D body sizes to obtain the clothes size by searching the corresponding size chart. Previous image-based methods need to calibrate the camera to estimate 3D information from 2D images, because the same person produces silhouettes of different appearance (e.g. size and shape) under different camera configurations (intrinsic and extrinsic parameters). Our method avoids camera calibration, which is much more convenient: we set up a virtual camera and learn the relationship between human height and silhouette size under this fixed camera configuration. After estimating the silhouette size, we regress the positions of 2D body landmarks, and define 2D body sizes as the distances between corresponding 2D body landmarks. Finally, we learn the relationship between 2D body sizes and 3D body sizes. The training samples for each regression step come from a database of 3D naked and dressed bodies created in previous work. We evaluate the whole procedure as well as each step of our framework, and compare the performance of several regression models. The total time consumption for clothes size prediction is less than 0.1 s, and the average estimation error of body sizes is 0.824 cm, which satisfies the tolerance for customers shopping for clothes online.
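The back end of the pipeline described in the abstract — 2D body sizes as distances between landmark pairs, a learned regression from 2D to 3D body sizes, and a size-chart lookup — can be sketched as follows. This is a minimal illustration, not the paper's actual model: the landmark pairs, the plain least-squares regressor, and the example size chart are all hypothetical stand-ins for the trained components and the retailer's chart.

```python
import numpy as np

def body_sizes_2d(landmarks, pairs):
    """2D body sizes: Euclidean distances between paired landmarks.

    `landmarks` is an (N, 2) array of 2D positions; `pairs` is a
    hypothetical list of (i, j) index pairs defining each body size.
    """
    return np.array([np.linalg.norm(landmarks[i] - landmarks[j])
                     for i, j in pairs])

def fit_linear(X, Y):
    """Least-squares linear map from 2D body sizes X to 3D body sizes Y.

    Stands in for the paper's learned regression model; appends a bias
    column so the fit includes an intercept.
    """
    Xb = np.hstack([X, np.ones((len(X), 1))])
    W, *_ = np.linalg.lstsq(Xb, Y, rcond=None)
    return W

def predict_sizes_3d(W, sizes_2d):
    """Apply the fitted linear map to one vector of 2D body sizes."""
    return np.append(sizes_2d, 1.0) @ W

def lookup_size(chest_cm, chart):
    """Pick the first size whose upper chest bound covers the measurement.

    `chart` is an illustrative list of (label, upper_bound_cm) rows,
    ordered from smallest to largest size.
    """
    for label, upper in chart:
        if chest_cm <= upper:
            return label
    return chart[-1][0]

# Example size chart (hypothetical values, keyed on chest girth in cm).
CHART = [("S", 90.0), ("M", 98.0), ("L", 106.0), ("XL", float("inf"))]
```

A typical use would be: extract landmarks from the front/side silhouettes, compute `body_sizes_2d`, map them to 3D sizes with a regressor fitted on the 3D-body database, then call `lookup_size` on the relevant girth. Linear regression is only one of the candidate models the abstract says were compared.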