Multimodal Empathic Feedback Through a Virtual Character / E. Chitti, M. Pezzera, N.A. Borghese (Lecture Notes in Computer Science). - In: Pattern Recognition. ICPR International Workshops and Challenges / edited by A. Del Bimbo, R. Cucchiara, S. Sclaroff, G.M. Farinella, T. Mei, M. Bertini, H.J. Escalante, R. Vezzani. - [s.l.]: Springer, 2021. - ISBN 978-3-030-68789-2. - pp. 156-162 (ICPR conference held in Milan in 2021) [10.1007/978-3-030-68790-8_13].
Multimodal Empathic Feedback Through a Virtual Character
E. Chitti; M. Pezzera; N.A. Borghese
2021
Abstract
The development and application of empathic virtual agents is rising fast in many fields, from rehabilitation to education, from mental health to personal wellbeing. Empathic agents should be developed to react appropriately to the user's affective state, with the aim of establishing an emotional connection with him/her. We propose a position paper to shape the design of an Empathic Virtual Character to be included in an existing platform with exer-games supporting postural rehabilitation. The character will express emotions to the user through facial animations and speech statements. The emotion the character expresses will be based on the user's current affective state, inferred from the input data. Finally, we propose a possible improvement of the developed interaction framework.