Location
https://www.kennesaw.edu/ccse/events/computing-showcase/sp26-cday-program.php
Document Type
Event
Start Date
April 22, 2026, 4:00 PM
Description
How emotions influence learning experiences in virtual reality (VR) remains unclear. To address this gap, we investigated the impact of emotion-eliciting lectures on learners’ experiences and outcomes in a pedagogical VR classroom, using a between-subjects design with two emotional conditions (positive vs. negative). Emotional cues were delivered by a virtual agent (VA) teacher through bodily gestures and verbal expressions during lecture delivery. In a user study (N=34), we collected multimodal data, including neurophysiological measures such as electroencephalography (EEG), galvanic skin response (GSR), heart rate (HR), heart rate variability (HRV), skin temperature, and eye gaze, along with self-reported emotion, learning experience, and quiz performance. The results showed significant differences in neurophysiological responses across the emotional conditions, particularly in EEG, GSR, HR, and skin temperature. However, no significant differences were observed in subjective emotional responses, learning experiences, or quiz performance. These findings suggest that emotionally expressive virtual agents can elicit measurable neurophysiological responses during VR learning even when participants are not consciously aware of these changes. This highlights the importance of accounting for implicit emotional responses when designing affect-aware VR learning environments.
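
As a minimal sketch of the kind of between-subjects comparison described above, the Python snippet below contrasts a single physiological feature between the two emotional conditions, choosing between Welch’s t-test and a Mann-Whitney U test based on a normality check. The variable names, the even 17/17 group split, and the data itself are illustrative assumptions; the study’s actual feature extraction and statistical pipeline are not specified here.

import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic stand-in for one per-participant feature (e.g., mean GSR during
# the lecture); in the real study these would be extracted from the recorded
# multimodal signals. These values are placeholders, not study data.
gsr_positive = rng.normal(loc=0.42, scale=0.10, size=17)
gsr_negative = rng.normal(loc=0.55, scale=0.12, size=17)

# Shapiro-Wilk normality check on each group guides the choice of test.
normal = all(stats.shapiro(g).pvalue > 0.05
             for g in (gsr_positive, gsr_negative))

if normal:
    # Welch's t-test: independent samples, no equal-variance assumption.
    result = stats.ttest_ind(gsr_positive, gsr_negative, equal_var=False)
else:
    # Nonparametric alternative when normality does not hold.
    result = stats.mannwhitneyu(gsr_positive, gsr_negative)

print(f"statistic={result.statistic:.3f}, p-value={result.pvalue:.4f}")

In practice, one such comparison would be run per extracted feature (EEG band power, GSR, HR, HRV, skin temperature), with an appropriate correction for multiple comparisons.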
Included in
GRP-163-229 Emotion Elicitation via Empathic AI (E-AI) Agent Teacher in VR Classroom