Travelling without moving: Auditory scene cues for translational self-motion
Conference paper, 2005

Creating a sense of illusory self-motion is crucial for many virtual reality applications, and the auditory modality is an essential, but often neglected, component of such simulations. In this paper, perceptual optimization of auditory-induced, translational self-motion (vection) simulation is studied using binaurally synthesized and reproduced sound fields. The results suggest that auditory scene consistency and ecological validity make a minimal set of acoustic cues sufficient for eliciting auditory-induced vection. Specifically, it was found that a focused-attention task and sound objects’ motion characteristics (approaching or receding) play an important role in self-motion perception. In addition, stronger sensations for auditory-induced self-translation than for previously investigated self-rotation also suggest a strong ecological-validity bias, as translation is the most common direction of movement.

Keywords

Auditory presence

vection

virtual reality

auditory-induced self-motion

Authors

Alexander Väljamäe

Chalmers, Civil and Environmental Engineering, Applied Acoustics

Chalmers, Signals and Systems, Communication, Antennas and Optical Networks

Pontus Larsson

Chalmers, Civil and Environmental Engineering, Applied Acoustics

Daniel Västfjäll

Chalmers, Civil and Environmental Engineering, Applied Acoustics

Mendel Kleiner

Chalmers, Civil and Environmental Engineering, Applied Acoustics

Proceedings of International Conference on Auditory Display

Subject Categories

Other Civil Engineering

More information

Latest update

12/5/2019