Virtual Scenes Construction Promotes Traditional Chinese Art Preservation

Authors: Liang, H., Bao, F., Sun, Y., Ge, C. and Chang, J.

Journal: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)

Volume: 13002 LNCS

Pages: 621-632

eISSN: 1611-3349

ISBN: 9783030890285

ISSN: 0302-9743

DOI: 10.1007/978-3-030-89029-2_46

Abstract:

Traditional Chinese opera is a valuable and fascinating heritage asset and one of the most representative folk arts in Chinese history. Its characteristic 'suppositionality' in stage scenery makes it possible to preserve this cultural heritage through digital means, e.g., 3D animation and Virtual Reality-based art shows. In this novel digital art form, the construction of virtual scenes is an important pillar: a variety of created models, including stage props and backgrounds, must be accommodated to provide a vivid performance stage. However, generating scenes with traditional manual 3D prop modelling is a tedious and strenuous task. In this paper, a novel shadow puppetry virtual stage scene construction approach based on semantics and prior probability is proposed for generating compositional virtual scenes. First, primitive models are obtained through semantic text segmentation and retrieval for scene composition; then, a scene placement algorithm based on prior probability assigns these 3D models within the virtual scene. This method is tested by generating the virtual performance stage for our shadow puppetry prototype system, within which various traditional art-specific 3D models are assembled. Its ease of use allows artists to create visually plausible virtual stages without professional scene modelling skills. A user study indicates the effectiveness and efficiency of our approach.
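
The abstract outlines a two-stage pipeline: retrieving primitive models via semantic text segmentation, then assigning them to the stage with a prior-probability-based placement step. Below is a minimal sketch of that idea only; the model library, the prior table, and every name in the code are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the two-stage pipeline described in the abstract:
# (1) segment a scene description into keywords and retrieve matching
#     primitive models, (2) place each model by sampling a stage region
#     from a prior probability distribution. All data and names below
#     are assumed for illustration.
import random

# Toy library mapping keywords to primitive 3D model files (assumed).
MODEL_LIBRARY = {
    "pavilion": "models/pavilion.obj",
    "willow": "models/willow.obj",
    "lantern": "models/lantern.obj",
}

# Toy placement priors: probability of each stage region per model type (assumed).
PLACEMENT_PRIOR = {
    "pavilion": {"background": 0.7, "midstage": 0.25, "foreground": 0.05},
    "willow": {"background": 0.4, "midstage": 0.4, "foreground": 0.2},
    "lantern": {"background": 0.1, "midstage": 0.3, "foreground": 0.6},
}


def retrieve_models(description: str) -> list[tuple[str, str]]:
    """Naive semantic segmentation: keyword matching against the library."""
    tokens = description.lower().split()
    return [(word, MODEL_LIBRARY[word]) for word in tokens if word in MODEL_LIBRARY]


def place_model(name: str) -> str:
    """Sample a stage region for the model according to its prior probabilities."""
    prior = PLACEMENT_PRIOR[name]
    regions, weights = zip(*prior.items())
    return random.choices(regions, weights=weights, k=1)[0]


if __name__ == "__main__":
    scene_text = "a pavilion beside a willow with a lantern"
    for name, path in retrieve_models(scene_text):
        print(f"{name}: load {path}, place in {place_model(name)}")
```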

https://eprints.bournemouth.ac.uk/36270/

Source: Scopus

Virtual Scenes Construction Promotes Traditional Chinese Art Preservation

Authors: Liang, H., Bao, F., Sun, Y., Ge, C. and Chang, J.

Journal: ADVANCES IN COMPUTER GRAPHICS, CGI 2021

Volume: 13002

Pages: 621-632

eISSN: 1611-3349

ISBN: 978-3-030-89028-5

ISSN: 0302-9743

DOI: 10.1007/978-3-030-89029-2_46

https://eprints.bournemouth.ac.uk/36270/

Source: Web of Science (Lite)

Virtual Scenes Construction Promotes Traditional Chinese Art Preservation

Authors: Liang, H., Bao, F., Sun, Y., Ge, C. and Chang, J.

Conference: Computer Graphics International Conference CGI 2021: Advances in Computer Graphics

Pages: 621-632

Publisher: Springer

ISBN: 9783030890285

ISSN: 0302-9743

Abstract:

Traditional Chinese opera is a valuable and fascinating heritage asset and one of the most representative folk arts in Chinese history. Its characteristic 'suppositionality' in stage scenery makes it possible to preserve this cultural heritage through digital means, e.g., 3D animation and Virtual Reality-based art shows. In this novel digital art form, the construction of virtual scenes is an important pillar: a variety of created models, including stage props and backgrounds, must be accommodated to provide a vivid performance stage. However, generating scenes with traditional manual 3D prop modelling is a tedious and strenuous task. In this paper, a novel shadow puppetry virtual stage scene construction approach based on semantics and prior probability is proposed for generating compositional virtual scenes. First, primitive models are obtained through semantic text segmentation and retrieval for scene composition; then, a scene placement algorithm based on prior probability assigns these 3D models within the virtual scene. This method is tested by generating the virtual performance stage for our shadow puppetry prototype system, within which various traditional art-specific 3D models are assembled. Its ease of use allows artists to create visually plausible virtual stages without professional scene modelling skills. A user study indicates the effectiveness and efficiency of our approach.

https://eprints.bournemouth.ac.uk/36270/

Source: BURO EPrints