Digital Relief Generation from 3D Models
This data was imported from Scopus:
Authors: Wang, M., Sun, Y., Zhang, H., Qian, K., Chang, J. and He, D.
Journal: Chinese Journal of Mechanical Engineering (English Edition)
© Chinese Mechanical Engineering Society and Springer-Verlag Berlin Heidelberg 2016.

Abstract: It is difficult to extend image-based relief generation to high-relief generation, because images contain insufficient height information. To generate reliefs from three-dimensional (3D) models, the height fields must be extracted from the model, but this yields only bas-reliefs. To overcome this limitation, an efficient method is proposed for generating both bas-reliefs and high-reliefs directly from 3D meshes. To produce visually appropriate relief features, the 3D meshes are first scaled. 3D unsharp masking is then used to enhance the visual features of the mesh, and average smoothing and Laplacian smoothing are applied to achieve better smoothing results. Finally, a nonlinear variable scaling scheme is employed to generate the bas-reliefs and high-reliefs. With the proposed method, relief models can be generated from arbitrary viewing positions, with different poses and with combinations of multiple 3D models, and the resulting models can be printed on 3D printers. The method thus provides an efficient and effective means of generating both high-reliefs and bas-reliefs under appropriate scaling factors.
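The pipeline described in the abstract (smoothing, unsharp masking for feature enhancement, then nonlinear scaling for height compression) can be illustrated on a height field. The following is a minimal sketch, not the authors' implementation: it operates on a 2D height grid rather than a mesh, uses a simple periodic-boundary Laplacian smoother, and the function names, parameters (`lam`, `amount`, `alpha`), and log-compression formula are all illustrative assumptions.

```python
import numpy as np

def laplacian_smooth(h, iterations=5, lam=0.5):
    # Grid analogue of Laplacian mesh smoothing; np.roll gives
    # periodic boundaries, which is a simplification for this sketch.
    h = h.astype(float).copy()
    for _ in range(iterations):
        lap = (np.roll(h, 1, 0) + np.roll(h, -1, 0)
               + np.roll(h, 1, 1) + np.roll(h, -1, 1)) / 4.0 - h
        h += lam * lap
    return h

def unsharp_mask(h, amount=1.5):
    # Unsharp masking: boost the high-frequency residual
    # (original minus smoothed) to enhance visual features.
    smooth = laplacian_smooth(h)
    return h + amount * (h - smooth)

def nonlinear_scale(h, alpha=5.0):
    # One possible nonlinear variable scaling: normalise to [0, 1],
    # then apply a monotone log compression that flattens large
    # heights more than small ones (bas-relief-like compression).
    h = (h - h.min()) / (np.ptp(h) + 1e-12)
    return np.log1p(alpha * h) / np.log1p(alpha)

def generate_relief(h, amount=1.5, alpha=5.0):
    # Enhance details first, then compress the height range.
    return nonlinear_scale(unsharp_mask(h, amount), alpha)
```

A smaller `alpha` leaves heights nearly linear (closer to a high-relief), while a larger `alpha` compresses the range more aggressively, as one would want for a bas-relief.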