Multisensory Trajectory Control at One Interaction Point, with Rhythm / A. Bellino, D. Rocchesso, R. Mulè, L. D'Arrigo Reitano. - In: AM '24: Proceedings / edited by L.A. Ludovico, D.A. Mauro. - [s.l.]: ACM, 2024 Sep 18. - ISBN 979-8-4007-0968-5. - pp. 399-404. (Paper presented at the 19th Audio Mostly conference, held in Milan in 2024 [10.1145/3678299.3678340].)
Multisensory Trajectory Control at One Interaction Point, with Rhythm
D. Rocchesso (second author)
2024
Abstract
We investigate navigation in two dimensions with velocity control, using a single-button interface. Users adjust the speed of the controlled object through rhythmic tapping, and its direction by pressing and tilting, releasing the button to finalize the rotation. Feedback on control actions and object motion is provided by integrating multiple sensory modalities. Tactile pulses are delivered at 30-degree intervals during rotation, emulating the detents of a rotary encoder. Simultaneously, an upward or downward sonic glissando accompanies rotation, rendering its direction. Both visual and auditory cues provide absolute positional feedback: discrete notes are played rhythmically, with pitch indicating vertical position, while stereo audio panning follows horizontal position. The rhythmic pace matches the on-screen object speed, as dictated by the tapping intervals. Two studies were designed around a target-following task under different sensory conditions. Study 1 showed that target tracking can be performed effectively with multisensory rhythmic interaction, even when the controlled object is intermittently hidden from view, although it was not possible to measure the advantage provided by auditory-tactile feedback. In Study 2, no significant performance differences were observed between auditory and tactile conditions under intermittent visual feedback, indicating that, to the extent that the two kinds of non-visual feedback are effective, they are essentially equivalent.
File: 3678299.3678340.pdf (open access)
Type: Publisher's version/PDF
Size: 2.27 MB
Format: Adobe PDF
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.