Personalized Facial Gesture Recognition for Accessible Mobile Gaming / M. Manzoni, D. Ahmetovic, S. Mascetti. - In: International Conference on Computers Helping People with Special Needs (ICCHP 2024), Lecture Notes in Computer Science. Springer, 2024. - ISBN 9783031628450. - pp. 120-127. Presented at the 19th ICCHP International Conference on Computers Helping People with Special Needs, held in Linz (AUT), July 8-12, 2024 [10.1007/978-3-031-62846-7_15].

Personalized Facial Gesture Recognition for Accessible Mobile Gaming

D. Ahmetovic (second-to-last author);
S. Mascetti (last author)
2024

Abstract

For people with upper extremity motor impairments, interaction with mobile devices is challenging because it relies on the use of the touchscreen. Existing assistive solutions replace inaccessible touchscreen interactions with sequences of simpler, accessible ones. However, the resulting sequence takes longer to perform than the original interaction, which makes it unsuitable for mobile video games. In this paper, we expand our prior work on accessible interaction substitutions for video games with a new interaction modality: facial gestures. Our approach allows users to play existing mobile video games using custom facial gestures. The gestures are defined by each user according to their own needs, and the system is trained with a small number of facial gesture samples collected from the user. The recorded gestures are then mapped to the touchscreen interactions required to play a target game. Each interaction corresponds to a single facial gesture, making this approach suitable for interacting with video games. We describe the facial gesture recognition pipeline, motivating the implementation choices through preliminary experiments conducted on example videos of facial gestures collected by one user without impairments. Preliminary results show that accurate classification of facial gestures (97%) is possible even with as few as 5 samples from the user.
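
The paper itself does not include code, and the abstract does not specify the classifier or features used. As a rough, hypothetical sketch of the kind of few-shot pipeline it describes, the snippet below enrolls 5 samples per user-defined gesture as face-landmark feature vectors, fits a 1-nearest-neighbor classifier, and maps the recognized label to a touchscreen action. All identifiers (GESTURES, ACTION_MAP, extract_features), the 468-landmark count (the MediaPipe FaceMesh convention), and the choice of 1-NN are illustrative assumptions, not the authors' implementation.

    # Illustrative sketch, NOT the paper's actual implementation: few-shot
    # personalized gesture classification with a 1-nearest-neighbor model.
    # Feature vectors are assumed to come from a face-landmark extractor
    # (e.g., 468 (x, y) points per frame, as MediaPipe FaceMesh produces);
    # here they are stand-in random arrays so the sketch is self-contained.
    import numpy as np
    from sklearn.neighbors import KNeighborsClassifier

    # Hypothetical gesture labels defined by the user.
    GESTURES = ["mouth_open", "brow_raise", "smile_left"]

    # Hypothetical mapping from recognized gestures to touchscreen actions.
    ACTION_MAP = {"mouth_open": "tap",
                  "brow_raise": "swipe_up",
                  "smile_left": "swipe_left"}

    def extract_features(frame_landmarks: np.ndarray) -> np.ndarray:
        """Flatten normalized (x, y) landmark coordinates into one vector.
        A real pipeline would likely use pose-invariant distances/angles."""
        return frame_landmarks.reshape(-1)

    # Enrollment: as few as 5 samples per gesture, mirroring the abstract.
    rng = np.random.default_rng(0)
    X_train = np.vstack([extract_features(rng.random((468, 2)))
                         for _ in range(15)])
    y_train = np.repeat(GESTURES, 5)
    clf = KNeighborsClassifier(n_neighbors=1).fit(X_train, y_train)

    # Recognition: classify a new frame, then inject the mapped touch action.
    query = extract_features(rng.random((468, 2)))
    gesture = clf.predict(query[None, :])[0]
    print(f"recognized {gesture} -> inject '{ACTION_MAP[gesture]}'")

A nearest-neighbor model fits the few-shot setting because it needs no gradient training and degrades gracefully with 5 samples per class; whether the paper uses this or another classifier is not stated in the abstract.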
Keywords: Facial gesture recognition; Mobile devices; Upper extremity motor impairments; Video games
Academic field: INFO-01/A - Computer Science
Johannes Kepler Universitaet Linz
Book Part (author)
Files in this record:

manzoni2024personalized.pdf
Access: under embargo until 05/07/2025
Type: Post-print, accepted manuscript (version accepted for publication)
Size: 516.13 kB
Format: Adobe PDF

978-3-031-62846-7_15.pdf
Access: restricted
Type: Publisher's version/PDF
Size: 562.29 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/1122281
Citations
  • PMC: not available
  • Scopus: 0
  • Web of Science: 0