Clothes size prediction from dressed-human silhouettes

Authors: Song, D., Tong, R., Chang, J., Wang, T., Du, J., Tang, M. and Zhang, J.J.

Journal: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Volume: 10582 LNCS

Pages: 86-98

eISSN: 1611-3349

ISBN: 9783319694863

ISSN: 0302-9743

DOI: 10.1007/978-3-319-69487-0_7

Abstract:

We propose an effective and efficient way to automatically predict clothes size for users buying clothes online. We take human height and dressed-human silhouettes in front and side views as input, and estimate 3D body sizes with a data-driven method. We adopt 20 body sizes that are closely related to clothes size, and use these 3D body sizes to obtain the clothes size by looking up the corresponding size chart. Previous image-based methods need to calibrate the camera to estimate 3D information from 2D images, because the same person's silhouette differs in appearance (e.g. size and shape) when the camera configuration (intrinsic and extrinsic parameters) differs. Our method avoids camera calibration, which is much more convenient. We set up a virtual camera and learn the relationship between human height and silhouette size under this camera configuration. After estimating silhouette size, we regress the positions of 2D body landmarks. We define 2D body sizes as the distances between corresponding 2D body landmarks. Finally, we learn the relationship between 2D body sizes and 3D body sizes. The training samples for each regression process come from a database of 3D naked and dressed bodies created by previous work. We evaluate the whole procedure and each process of our framework, and compare the performance with several regression models. The total time consumed for clothes size prediction is less than 0.1 s, and the average estimation error of body sizes is 0.824 cm, which satisfies the tolerance for customers shopping for clothes online.
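
The abstract describes a staged regression pipeline: silhouette size is inferred from height under a fixed virtual camera, 2D landmarks are regressed from the silhouette, 2D body sizes are taken as distances between landmarks, 3D body sizes are regressed from the 2D sizes, and a size chart is consulted. The sketch below illustrates only the last two stages in Python, using ridge regression as a stand-in for the "several regression models" the paper compares; all function names, data shapes, and the toy size chart are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of the 2D-sizes -> 3D-sizes -> size-chart stages described in the
# abstract. Hypothetical names/shapes; ridge regression is an illustrative choice.
import numpy as np
from sklearn.linear_model import Ridge


def silhouette_to_2d_sizes(landmarks_front, landmarks_side, pairs):
    """2D body sizes = Euclidean distances between corresponding 2D landmarks.

    pairs: list of (view, (i, j)) index pairs, view in {"front", "side"}.
    """
    sizes = []
    for view, (i, j) in pairs:
        pts = landmarks_front if view == "front" else landmarks_side
        sizes.append(np.linalg.norm(pts[i] - pts[j]))
    return np.array(sizes)


class BodySizeRegressor:
    """Learns the mapping from 2D body sizes to the 20 3D body sizes."""

    def __init__(self):
        self.model = Ridge(alpha=1.0)  # illustrative; not the paper's chosen model

    def fit(self, sizes_2d, sizes_3d):  # sizes_2d: (N, D), sizes_3d: (N, 20)
        self.model.fit(sizes_2d, sizes_3d)
        return self

    def predict(self, sizes_2d):
        return self.model.predict(np.atleast_2d(sizes_2d))


def lookup_clothes_size(sizes_3d, size_chart):
    """Pick the chart entry whose measurements are closest to the estimate."""
    names, rows = zip(*size_chart.items())
    diffs = np.abs(np.asarray(rows) - np.ravel(sizes_3d)).sum(axis=1)
    return names[int(np.argmin(diffs))]


if __name__ == "__main__":
    # Synthetic training data standing in for the 3D naked/dressed body database.
    rng = np.random.default_rng(0)
    X = rng.uniform(20, 120, size=(500, 8))           # hypothetical 2D body sizes
    W = rng.uniform(0.5, 1.5, size=(8, 20))
    Y = X @ W + rng.normal(0, 0.5, size=(500, 20))    # hypothetical 3D body sizes (cm)
    reg = BodySizeRegressor().fit(X, Y)
    est = reg.predict(X[0])
    chart = {"S": list(Y[10]), "M": list(Y[20]), "L": list(Y[30])}  # toy size chart
    print(lookup_clothes_size(est, chart))
```

Ridge regression is used here only because it is a simple multi-output regressor; the paper reports a comparison across regression models without naming one in the abstract.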

https://eprints.bournemouth.ac.uk/30129/

Source: Scopus

Clothes size prediction from dressed-human silhouettes

Authors: Song, D., Tong, R., Chang, J., Wang, T., Du, J., Tang, M. and Zhang, J.J.

Conference: Next Generation Computer Animation Techniques: Third International Workshop, AniNex 2017

Pages: 86-98

ISBN: 9783319694863

ISSN: 0302-9743

https://eprints.bournemouth.ac.uk/30129/

https://link.springer.com/content/pdf/10.1007%2F978-3-319-69487-0.pdf

Source: BURO EPrints