

Poster

EEVR: A Virtual Reality-Based Emotion Dataset Featuring Paired Physiological Signals and Textual Descriptions

Pragya Singh · Ritvik Budhiraja · Ankush Gupta · Anshul Goswami · Mohan Kumar · Pushpendra Singh

Thu 12 Dec 11 a.m. PST — 2 p.m. PST

Abstract:

EEVR (Emotion Elicitation in Virtual Reality) is a novel dataset designed for language supervision-based pre-training and emotion recognition tasks, such as valence and arousal classification. It features high-quality physiological signals, including electrodermal activity (EDA) and photoplethysmography (PPG), acquired through emotion elicitation via 360-degree virtual reality (VR) videos. Additionally, it includes subject-wise textual descriptions of the emotions experienced during each stimulus, gathered from qualitative interviews. The emotional stimuli were carefully selected to induce a range of emotions covering all four quadrants of Russell's circumplex model. The dataset consists of recordings from 37 participants and is the first to pair raw text with physiological signals, providing additional contextual information that objective labels cannot offer. Baseline models for arousal, valence, and emotion classification are provided, along with code for the data cleaning and feature extraction pipelines. We show that augmenting our signals with self-reported textual annotations improves performance on physiological signal-based emotion recognition tasks. The dataset is available at https://melangelabiiitd.github.io/EEVR/.
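The four-quadrant coverage of Russell's circumplex model mentioned above can be illustrated with a minimal sketch that maps self-reported valence and arousal ratings to quadrant labels. The rating scale, midpoint, and label names here are illustrative assumptions, not details taken from the EEVR release.

```python
# Illustrative sketch (not the EEVR pipeline): assign a valence/arousal
# rating pair to one of the four quadrants of Russell's circumplex model.
# A 1-9 rating scale with midpoint 5.0 is an assumption for this example.

def circumplex_quadrant(valence: float, arousal: float,
                        midpoint: float = 5.0) -> str:
    """Return a quadrant label for a (valence, arousal) rating pair."""
    high_valence = valence >= midpoint
    high_arousal = arousal >= midpoint
    if high_valence and high_arousal:
        return "HVHA"  # e.g. joy, excitement
    if high_valence:
        return "HVLA"  # e.g. calm, contentment
    if high_arousal:
        return "LVHA"  # e.g. fear, anger
    return "LVLA"      # e.g. sadness, boredom
```

Binary valence and arousal classification, as in the baselines described above, corresponds to predicting each of the two threshold decisions in this sketch separately.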
