MOBILE ASSISTIVE TECHNOLOGIES FOR PEOPLE WITH VISUAL IMPAIRMENT: SENSING AND CONVEYING INFORMATION TO SUPPORT ORIENTATION, MOBILITY AND ACCESS TO IMAGES

A. Gerino
2017

Abstract

Smartphones are accessible to persons with visual impairment or blindness (VIB): screen reader technologies, integrated with mobile operating systems, enable non-visual interaction with the device. In addition, features such as GPS receivers, inertial sensors and cameras enable the development of Mobile Assistive Technologies (MATs) to support people with VIB. A preliminary analysis, conducted with a user-centric approach, highlighted issues experienced by people with VIB in everyday activities in three main fields: orientation, mobility and access to images. Traditional approaches to these issues, based on assistive tools and technologies, have limitations. In the field of mobility, for example, existing navigation aids (e.g., the white cane) cannot be used to perceive environmental features such as crosswalks or the current state of traffic lights. In the field of orientation, tactile maps used to develop cognitive maps of the environment are limited both by the amount of information that can be represented on a single surface and by their lack of interactivity; the same two issues arise in other fields where access to graphical information is of paramount importance, such as the teaching of STEM subjects. This work presents new MATs that address these limitations by introducing novel solutions in different fields of Computer Science. Original computer vision techniques, designed to detect the presence of pedestrian crossings and the state of traffic lights, are used to sense information from the environment and support the mobility of people with VIB. Novel sonification techniques are introduced to efficiently convey information with three different goals: first, to convey guidance information at urban crossings; second, to enhance the development of cognitive maps by augmenting tactile surfaces; third, to enable quick access to images. The experience reported in this dissertation shows that the proposed MATs are effective in supporting people with VIB and, more generally, that mobile devices are a versatile platform for affordable and pervasive access to assistive technologies. Involving target users in the evaluation of MATs emerged as a major challenge in this work; however, it is shown how this challenge can be addressed by adopting large-scale evaluation techniques typical of HCI research.
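As an illustration of the kind of environment sensing mentioned in the abstract, the sketch below shows one common approach to traffic-light state recognition: classifying the dominant color inside a candidate lamp region using HSV thresholds with OpenCV. This is a minimal, hypothetical example for orientation only; the function name, the thresholds and the assumption that a lamp region has already been located are not taken from the dissertation.

# A minimal, hypothetical sketch (Python + OpenCV) of HSV-based traffic-light
# state classification; thresholds and the region-proposal step are assumptions,
# not the dissertation's actual technique.
import cv2

def classify_light_state(bgr_roi):
    """Return 'red', 'green' or 'unknown' for a cropped traffic-light lamp region."""
    hsv = cv2.cvtColor(bgr_roi, cv2.COLOR_BGR2HSV)

    # Red wraps around the hue axis, so two hue ranges are combined.
    red_mask = cv2.bitwise_or(
        cv2.inRange(hsv, (0, 100, 100), (10, 255, 255)),
        cv2.inRange(hsv, (170, 100, 100), (180, 255, 255)),
    )
    green_mask = cv2.inRange(hsv, (40, 100, 100), (90, 255, 255))

    red_px = cv2.countNonZero(red_mask)
    green_px = cv2.countNonZero(green_mask)
    total_px = bgr_roi.shape[0] * bgr_roi.shape[1]

    # Require a minimal fraction of saturated colored pixels before answering,
    # so that unrelated scene content does not produce a spurious state.
    if max(red_px, green_px) < 0.05 * total_px:
        return "unknown"
    return "red" if red_px > green_px else "green"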
28 Feb 2017
Sector INF/01 - Computer Science
Advisor: MASCETTI, SERGIO
School director: BOLDI, PAOLO
Doctoral Thesis
MOBILE ASSISTIVE TECHNOLOGIES FOR PEOPLE WITH VISUAL IMPAIRMENT: SENSING AND CONVEYING INFORMATION TO SUPPORT ORIENTATION, MOBILITY AND ACCESS TO IMAGES / A. Gerino ; advisor: S. Mascetti ; co-advisor: C. Bernareggi ; school director: P. Boldi. DIPARTIMENTO DI INFORMATICA, 2017 Feb 28. 29th cycle, Academic Year 2016. [10.13130/gerino-andrea_phd2017-02-28].
Files in this item:
phd_unimi_R10579.pdf - complete doctoral thesis, open access, 10.81 MB, Adobe PDF

Use this identifier to cite or link to this item: https://hdl.handle.net/2434/476683