Unvalidated product record
Note: the data displayed have not been validated by FBK.
Title: | User-centric Affective Video Tagging from MEG and Peripheral Physiological Responses |
Authors: | |
Publication date: | 2013 |
Abstract: | This paper presents a new multimodal database and the associated results for the characterization of affect (valence, arousal and dominance) using Magnetoencephalogram (MEG) brain signals and peripheral physiological signals (horizontal EOG, ECG, trapezius EMG). We attempt single-trial classification of affect in movie and music video clips employing emotional responses extracted from eighteen participants. The main findings of this study are that: (i) the MEG signal effectively encodes affective viewer responses, (ii) clip arousal is better predicted by MEG, while peripheral physiological signals are more effective for predicting valence, and (iii) prediction performance is better for movie clips than for music video clips. |
Handle: | http://hdl.handle.net/11582/199010 |
ISBN: | 9780769550480 |
Appears in type: | 4.1 Contribution in Conference Proceedings |