Sound for Multisensory Motion Simulators
Doctoral thesis, 2007

Interaction in a virtual reality environment often implies situations of illusory self-motion, as, for example, in flight or driving scenarios. Striving for pictorial realism, currently available motion simulators often exhibit relatively poor sound design. However, a substantial body of research has shown that human perception is multisensory in nature. It is therefore logical to assume that acoustic information should contribute to the perception of illusory self-motion (vection). The presented studies used an iterative synthesis-evaluation loop in which participants' vection and presence (the sense of "being there") responses guided the search for the most salient auditory cues and their multisensory combinations. Paper A provides a first integrative review of the studies on auditorily induced vection, which have been scattered over time and across research disciplines. Paper B explores optimal combinations of perceptual cues between vision (field of view) and audition (spatial resolution) when presenting a rotating environment. Paper C examines cognitive factors in purely auditory and auditory-vibrotactile induced circular vection. In Paper D, the specific influence of an audio-vibrotactile engine sound metaphor on linear vection responses is evaluated. The idea of using an engine sound to represent self-motion, alone or with its multisensory counterparts, is further addressed in Paper E, where participants' imagery vividness scores are also considered. The results from Papers B-E serve as a basis for the design of a transportable, multimodal motion simulator prototype. In Paper F, the feasibility of inducing vection by means of binaural bone-conducted sound is tested using this prototype. Paper G outlines a perceptually optimized, multisensory design that can be used in future motion simulators and discusses its possible implications for the entertainment industry.
To conclude, sound is an important but often neglected component in multisensory self-motion simulations, providing both perceptual and cognitive cues. Hence, it might be beneficial to think in terms of the amodal categories of unitary space, time, objects and events rather than to optimize vection cues in each modality separately. The presented results have implications for various research areas, including multisensory integration of self-motion cues, posture prostheses, navigation in unusual gravitoinertial environments and applications for the visually impaired.

Keywords: auditory scene synthesis, spatial audio, virtual reality, cognitive acoustics, illusory self-motion, multisensory optimization

VK 1121, Sven Hultins gata 6, Chalmers University of Technology
Opponent: Prof. Jack Loomis, Department of Psychology, University of California, Santa Barbara, USA


Alexander Väljamäe

Chalmers, Civil and Environmental Engineering, Applied Acoustics, Room Acoustics

Vibrotactile enhancement of auditory induced self-motion and spatial presence

Journal of the Audio Engineering Society, Vol. 54 (2006), pp. 954-963

Article in scientific journal


Human-computer interaction (interaction design)



Doctoral theses at Chalmers University of Technology. New series: 2668

