Sonification of guidance data during road crossing for people with visual impairments or blindness / S. Mascetti, L. Picinali, A. Gerino, D. Ahmetovic, C. Bernareggi. - In: INTERNATIONAL JOURNAL OF HUMAN-COMPUTER STUDIES. - ISSN 1071-5819. - 85(2016 Jan), pp. 16-26. [10.1016/j.ijhcs.2015.08.003]

Sonification of guidance data during road crossing for people with visual impairments or blindness

S. Mascetti (first author); L. Picinali; A. Gerino; D. Ahmetovic (penultimate author); C. Bernareggi (last author)
2016

Abstract

In recent years, several solutions have been proposed to support people with visual impairments or blindness during road crossing. These solutions focus on computer vision techniques for recognizing pedestrian crosswalks and computing their position relative to the user. This contribution addresses a different problem: the design of an auditory interface that can effectively guide the user during road crossing. Two original auditory guiding modes based on data sonification are presented and compared with a guiding mode based on speech messages. Experimental evaluation shows that no single guiding mode is best suited for all test subjects. The average time to align and cross does not differ significantly among the three guiding modes, and test subjects distribute their preferences for the best guiding mode almost uniformly among the three solutions. The experiments also show that decoding the sonified instructions requires more effort than decoding the speech instructions, and that test subjects require frequent 'hints' (in the form of speech messages). Despite this, more than two thirds of the test subjects prefer one of the two guiding modes based on sonification. There are two main reasons for this: first, speech messages make it harder to hear the sounds of the environment; second, sonified messages convey information about the "quantity" of the expected movement.
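The abstract does not detail the sonification mappings used in the study. As a purely illustrative sketch (not the authors' method), the snippet below shows one plausible way a guidance quantity, such as the user's misalignment angle with respect to the crosswalk, could be mapped onto pitch and beep rate so that the sound encodes "how much" to rotate; all names, parameter ranges, and mappings are assumptions.

# Hypothetical sketch (not from the paper): mapping a misalignment angle to
# sonification parameters. Larger misalignment -> higher pitch and faster beeps.
def sonification_parameters(misalignment_deg: float,
                            min_freq: float = 220.0,
                            max_freq: float = 880.0,
                            max_angle: float = 90.0) -> dict:
    """Map a misalignment angle (degrees) to tone frequency and beep interval."""
    # Clamp the angle to the range the mapping is defined for.
    angle = max(0.0, min(abs(misalignment_deg), max_angle))
    # Pitch grows with misalignment, so the user hears the "quantity" of rotation.
    frequency = min_freq + (max_freq - min_freq) * (angle / max_angle)
    # Beeps slow down as the user approaches alignment with the crosswalk.
    interval_s = 0.2 + 0.8 * (1.0 - angle / max_angle)
    return {"frequency_hz": frequency, "interval_s": interval_s}

if __name__ == "__main__":
    for angle in (90, 45, 10, 0):
        print(angle, sonification_parameters(angle))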
blind people; people with visual impairments or blindness; sonification of guidance data; spatial cognition; hardware and architecture; engineering (all); software; human-computer interaction; human factors and ergonomics
Disciplinary sector: INF/01 - Informatica (Computer Science)
Jan 2016
Article (author)
Files in this record:
File: preprint.pdf (Open Access since 20/10/2017)
Type: Post-print / accepted manuscript (version accepted by the publisher)
Size: 960.92 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/429207
Citations
  • PMC: not available
  • Scopus: 49
  • Web of Science: 28