
Integrating nonspatial, nontemporal multisensory information in action-based perception / B. Giordano, F. Avanzini, M. Wanderley, S. McAdams - In: Proc. 10th Int. Multisensory Research Forum (IMRF). [s.l.]: IMRF, 2009. (Conference: International Multisensory Research Forum (IMRF), held in New York, 2009.)

Integrating nonspatial, nontemporal multisensory information in action-based perception

F. Avanzini
2009

Abstract

The study of multisensory perception has traditionally emphasized the integration of temporal and spatial aspects of events in nonaction settings. We adopted an action-based paradigm to investigate the factors affecting how nonspatial and nontemporal multisensory information is integrated. On each trial, participants struck a virtual object at a constant velocity and received feedback on correctness. When a performance criterion was reached, feedback was eliminated, the properties of the stimulus were changed, and the effects on striking velocity and performance were measured. In Experiment 1, we studied the effects of the congruence of multisensory information and of a participant's expertise in the task. Participants were presented with either a unimodal or a multisensory audio-haptic display in which the haptic and sound hardness of the object were manipulated. In multisensory trials, the audio-haptic changes could be congruent (e.g., both increased in hardness) or incongruent. We recruited participants with different levels of expertise with the task: percussionists, nonpercussionist musicians, and nonmusicians. Overall, striking velocity decreased with an increase in both haptic and sound hardness. The level of expertise influenced the effects of haptic, but not of sound, hardness: only percussionists struck harder haptic objects faster. For all participants, striking velocity in the multisensory trials was most strongly affected by changes in haptic hardness. Further, the effects of hardness were much more similar across participants in the auditory than in the haptic modality. Multisensory congruence modulated the effects of sound but not haptic hardness: whereas in congruent trials the effects of audio hardness were the same as in the unimodal condition, audio hardness was behaviorally irrelevant when it varied in opposition to haptic hardness. Overall, performance did not improve from the unimodal to the multisensory context. Only for nonmusicians was performance significantly better in the audio-only condition. In summary, the effects of the least relevant modality, audition, were more similar across individuals, were independent of expertise, and were modulated by multisensory congruence. In contrast, the effects of the primary modality, haptics, varied more across individuals, were influenced by expertise, and were independent of multisensory congruence. In Experiment 2, we assessed the behavioral relevance of a visual property of the display. Nonmusicians were presented with either a visual or a visual-audio-haptic stimulus. We manipulated the speed of the visual striking object and, in multisensory congruent trials, also the sound and haptic hardness. Participants wore a head-mounted display. Striking velocity decreased for increasing haptic and sound hardness, and for decreasing speeds of the striking object. Future investigations will extend the results of Experiment 1 to visual-audio-haptic contexts.
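
As a rough illustration of the trial protocol summarized above (correctness feedback on striking velocity until a performance criterion is met, after which feedback is removed and the stimulus is changed), a minimal Python sketch follows. All function names, thresholds, and the criterion rule are assumptions made for illustration; the abstract does not specify the actual implementation.

# Minimal sketch (assumed, not from the paper) of the feedback-to-criterion phase.
import random

def measure_striking_velocity():
    # Stand-in for the real sensor/motion-capture reading (hypothetical).
    return random.gauss(1.0, 0.15)

def run_feedback_phase(target_velocity, tolerance=0.1, criterion_hits=5, max_trials=100):
    """Return the trial count needed to reach the (assumed) criterion of
    `criterion_hits` consecutive correct strikes, or None if never reached."""
    consecutive = 0
    for trial in range(1, max_trials + 1):
        velocity = measure_striking_velocity()
        correct = abs(velocity - target_velocity) <= tolerance
        print("trial", trial, "correct" if correct else "incorrect")  # feedback on correctness
        consecutive = consecutive + 1 if correct else 0
        if consecutive >= criterion_hits:
            return trial
    return None

# Once the criterion is reached, feedback would be withheld and the haptic and
# sound hardness of the virtual object changed, while striking velocity and
# performance continue to be recorded.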
Research area INF/01 - Computer Science
Research area ING-INF/05 - Information Processing Systems
2009
Book Part (author)

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/2434/657896