Real-time auditory-visual distance rendering for a virtual reaching task / L. Mion, F. Avanzini, B. Mantel, B. Bardy, T. Stoffregen. In: VRST '07: Proceedings. [s.l.]: ACM, 2007. ISBN 9781595938633. pp. 179-182. (Presented at the Symposium on Virtual Reality Software and Technology, held in Newport Beach in 2007.)
Real-time auditory-visual distance rendering for a virtual reaching task
F. Avanzini;
2007
Abstract
This paper reports on a study of the perception and rendering of distance in multimodal virtual environments. A model for binaural sound synthesis is discussed, and its integration into a real-time system with motion tracking and visual rendering is presented. Results from a validation experiment show that the model effectively simulates the relevant auditory cues for distance perception in dynamic conditions. The model is then used in a subsequent experiment on the perception of egocentric distance. The design and preliminary results from this experiment are discussed.
File: mion_vrst07.pdf (restricted access)
Type: Publisher's version/PDF
Size: 273.11 kB
Format: Adobe PDF