User-centric Affective Video Tagging from MEG and Peripheral Physiological Responses
Kia, Seyed Mostafa; Avesani, Paolo
2013-01-01
Abstract
This paper presents a new multimodal database and the associated results for the characterization of affect (valence, arousal and dominance) using Magnetoencephalogram (MEG) brain signals and peripheral physiological signals (horizontal EOG, ECG, trapezius EMG). We attempt single-trial classification of affect in movie and music video clips using emotional responses recorded from eighteen participants. The main findings of this study are that: (i) the MEG signal effectively encodes affective viewer responses, (ii) clip arousal is better predicted by MEG, while peripheral physiological signals are more effective for predicting valence, and (iii) prediction performance is better for movie clips than for music video clips.