MoodyNPC: Personality & Facial Expressions for Virtual Humans / F. Martinelli, N.A. Borghese, F. Bultrini, L.A. Ripamonti, A. Zaniboni (Lecture Notes in Computer Science). - In: Extended Reality / edited by L.T. De Paolis, P. Arpaia, M. Sacco. - [s.l.] : Springer, 2025. - ISBN 978-3-031-97777-0. - pp. 23-42 (XR conference held in Otranto in 2025) [10.1007/978-3-031-97778-7_2].
MoodyNPC: Personality & Facial Expressions for Virtual Humans
F. Martinelli; N.A. Borghese; L.A. Ripamonti
2025
Abstract
Creating believable virtual humans with dynamic emotions, personality-driven behavior, and expressive facial animations is essential for immersive virtual experiences. Traditional scripted approaches often result in rigid and predictable interactions, limiting realism. This paper presents MoodyNPCs, an AI-driven framework that enhances the believability of virtual humans by integrating personality modeling, emotional dynamics, and real-time facial expression synthesis. The system leverages the Big Five personality model [8] and Plutchik’s emotion wheel to create nuanced emotional responses that evolve based on player interactions. Emotional states are continuously updated and influence both dialogue and facial expressions, modeled using Ekman’s Facial Action Coding System (FACS) to ensure realistic representation. By dynamically blending primary emotions and adjusting their intensity, the system enables virtual humans to display more fluid and context-aware reactions. Additionally, a Large Language Model (LLM) generates adaptive dialogues that reflect the character’s emotional state and personality. The framework was tested in both Virtual Reality (VR) and non-VR environments, demonstrating that expressive facial animations significantly enhance perceived realism, with VR increasing immersion and emotional engagement. Future developments will refine emotional transitions and expand expressivity through gestures, voice modulation, and environmental interactions to create more lifelike virtual humans.
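The abstract describes a state that blends Plutchik's primary emotions, adjusts their intensity over time, and feeds the dominant emotion into FACS-based expression synthesis. A minimal, hypothetical sketch of such an update loop is given below; the class, method names, and the `reactivity`/`decay` parameters (a stand-in for personality-derived weighting from the Big Five profile) are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass, field

# Plutchik's eight primary emotions, as referenced in the abstract.
PRIMARY = ("joy", "trust", "fear", "surprise",
           "sadness", "disgust", "anger", "anticipation")


@dataclass
class EmotionState:
    # Intensity of each primary emotion, kept in [0, 1].
    intensity: dict = field(
        default_factory=lambda: {e: 0.0 for e in PRIMARY})

    def update(self, stimulus: dict,
               reactivity: float = 0.5, decay: float = 0.1) -> None:
        """Blend the current state toward an interaction stimulus.

        `reactivity` is a placeholder for a personality-derived weight
        (e.g. a more neurotic character reacting more strongly);
        `decay` lets unstimulated emotions fade between interactions.
        """
        for e in PRIMARY:
            target = stimulus.get(e, 0.0)
            if target > 0.0:
                blended = (1 - reactivity) * self.intensity[e] \
                          + reactivity * target
                self.intensity[e] = min(1.0, blended)
            else:
                self.intensity[e] = max(0.0, self.intensity[e] - decay)

    def dominant(self) -> str:
        """Most intense primary emotion, e.g. to select FACS action units."""
        return max(self.intensity, key=self.intensity.get)
```

For example, a hostile player remark could be mapped to `npc.update({"anger": 0.8}, reactivity=0.6)`, after which `npc.dominant()` returns `"anger"` and the other emotions decay toward zero.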
Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.