Maintaining frame rate perception in interactive environments by exploiting audio-visual cross-modal interaction

Authors: Hulusić, V., Debattista, K., Aggarwal, V. and Chalmers, A.

Journal: Visual Computer

Volume: 27

Issue: 1

Pages: 57-66

ISSN: 0178-2789

DOI: 10.1007/s00371-010-0514-2

Abstract:

The entertainment industry, primarily the video games industry, continues to dictate the development and performance requirements of graphics hardware and computer graphics algorithms. However, despite the enormous progress in the last few years, it is still not possible to achieve some of the industry's demands, in particular high-fidelity rendering of complex scenes in real-time, on a single desktop machine. A realisation that sound/music and other senses are important to entertainment led to an investigation of alternative methods, such as cross-modal interaction, in order to try and achieve the goal of "realism in real-time". In this paper we investigate the cross-modal interaction between vision and audition for reducing the amount of computation required to compute visuals by introducing movement-related sound effects. Additionally, we look at the effect of camera movement speed on temporal visual perception. Our results indicate that slow animations are perceived as smoother than fast animations. Furthermore, introducing the sound effect of footsteps to walking animations further increased the animation smoothness perception. This has the consequence that, for certain conditions, the number of frames that need to be rendered each second can be reduced, saving valuable computation time, without the viewer being aware of this reduction. The results presented are another step towards the full understanding of the auditory-visual cross-modal interaction and its importance for helping achieve "realism in real-time". © Springer-Verlag 2010.

https://eprints.bournemouth.ac.uk/30364/

Source: Scopus

Maintaining frame rate perception in interactive environments by exploiting audio-visual cross-modal interaction

Authors: Hulusic, V., Debattista, K., Aggarwal, V. and Chalmers, A.

Journal: VISUAL COMPUTER

Volume: 27

Issue: 1

Pages: 57-66

eISSN: 1432-2315

ISSN: 0178-2789

DOI: 10.1007/s00371-010-0514-2

https://eprints.bournemouth.ac.uk/30364/

Source: Web of Science (Lite)

Maintaining frame rate perception in interactive environments by exploiting audio-visual cross-modal interaction

Authors: Hulusić, V., Debattista, K., Aggarwal, V. and Chalmers, A.

Journal: The Visual Computer

Volume: 27

Pages: 57-66

Publisher: Springer

https://eprints.bournemouth.ac.uk/30364/

Source: Manual

Maintaining frame rate perception in interactive environments by exploiting audio-visual cross-modal interaction

Authors: Hulusic, V., Debattista, K., Aggarwal, V. and Chalmers, A.

Journal: The Visual Computer

Volume: 27

Issue: 1

Pages: 57-66

ISSN: 0178-2789

Abstract:

The entertainment industry, primarily the video games industry, continues to dictate the development and performance requirements of graphics hardware and computer graphics algorithms. However, despite the enormous progress in the last few years, it is still not possible to achieve some of the industry’s demands, in particular high-fidelity rendering of complex scenes in real-time, on a single desktop machine. A realisation that sound/music and other senses are important to entertainment led to an investigation of alternative methods, such as cross-modal interaction, in order to try and achieve the goal of “realism in real-time”. In this paper we investigate the cross-modal interaction between vision and audition for reducing the amount of computation required to compute visuals by introducing movement-related sound effects. Additionally, we look at the effect of camera movement speed on temporal visual perception. Our results indicate that slow animations are perceived as smoother than fast animations.

Furthermore, introducing the sound effect of footsteps to walking animations further increased the animation smoothness perception. This has the consequence that, for certain conditions, the number of frames that need to be rendered each second can be reduced, saving valuable computation time, without the viewer being aware of this reduction. The results presented are another step towards the full understanding of the auditory-visual cross-modal interaction and its importance for helping achieve “realism in real-time”.

https://eprints.bournemouth.ac.uk/30364/

Source: BURO EPrints