
Interactive spatial sonification for non-visual exploration of virtual maps / M. Geronazzo, A. Bedin, L. Brayda, C. Campus, F. Avanzini. - In: INTERNATIONAL JOURNAL OF HUMAN-COMPUTER STUDIES. - ISSN 1071-5819. - 85(2016), pp. 4-15.

Interactive spatial sonification for non-visual exploration of virtual maps

F. Avanzini
2016

Abstract

This paper presents a multimodal interactive system for non-visual (auditory-haptic) exploration of virtual maps. The system haptically displays the height profile of a map through a tactile mouse. In addition, spatial auditory information is provided in the form of virtual anchor sounds located at specific points of the map and delivered through headphones using customized Head-Related Transfer Functions (HRTFs). The validity of the proposed approach is investigated through two experiments on non-visual exploration of virtual maps. The first experiment is preliminary in nature and assesses the effectiveness and complementarity of auditory and haptic information in a goal-reaching task. The second experiment investigates the potential of the system to provide subjects with spatial knowledge: specifically, to help them construct a cognitive map depicting simple geometrical objects. Results from both experiments show that the proposed concept, design, and implementation effectively exploit the complementary natures of the "proximal" haptic modality and the "distal" auditory modality. Implications for orientation & mobility (O&M) protocols for visually impaired subjects are discussed.
3D audio; Binaural sound; Haptic mouse; Haptics; Multimodal interaction; Multisensory integration; Non-visual navigation; Virtual maps; Visual impairment; Hardware and Architecture; Engineering (all); Software; Human-Computer Interaction; Human Factors and Ergonomics
Field INF/01 - Computer Science
Field ING-INF/05 - Information Processing Systems
Article (author)
Files in this record:
File: geronazzo_ijhcs16.pdf
Access: restricted
Type: Publisher's version/PDF
Format: Adobe PDF
Size: 1.39 MB

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/653949
Citations
  • PMC: ND
  • Scopus: 42
  • Web of Science (ISI): 34