About

The automatic recognition of human emotions is of great interest for multimedia applications and brain-computer interfaces. While users’ emotions can be assessed with questionnaires, the results may be biased because the answers can be influenced by social expectations. More objective measures of emotion can be obtained by studying the users’ physiological responses. The present database was constructed in particular to evaluate the usefulness of electroencephalography (EEG) for emotion recognition in the context of audio-visual stimuli, but it also contains simultaneous physiological recordings (electrocardiogram, respiration, blood oxygen level, pulse rate, galvanic skin response) in addition to the EEG data. To the best of our knowledge, this is the first publicly available database containing high-resolution (HR) EEG recordings for the study of emotions.

Experiment

In order to study the emotions elicited by audio-visual stimuli (film excerpts), we recruited 40 subjects to participate in an experiment. During the experiment, we presented 13 emotional videos (7 eliciting positive emotions and 6 eliciting negative emotions) and 13 neutral videos to the subjects. The subjects were comfortably seated in a chair at about 1 m from the 21'' computer screen used to present the stimuli.

[Figure 1: Experimental protocol.]

The experiment was implemented in E-Prime 2.0 (Psychology Software Tools, Pittsburgh, PA) according to the experimental protocol shown in Figure 1. Except for the test trial, the neutral and emotional videos were presented in an arbitrary order. The emotional stimuli were selected from the FilmStim database (A. Schaefer, F. Nils, X. Sanchez, and P. Philippot, “Assessing the effectiveness of a large database of emotion-eliciting films: a new tool for emotion researchers,” Cognition & Emotion, vol. 24, no. 7, pp. 1153–1172, 2010, http://nemo.psp.ucl.ac.be/FilmStim/) and were between 40 s and 6 min long. After each emotional video, the subjects rated the emotions felt during the video on a discrete scale: negative (-1), neutral (0), or positive (1). HR-EEG recordings and physiological measurements (1-channel electrocardiogram, respiration, blood oxygen level, pulse rate) were acquired with a 257-channel EGI system (Electrical Geodesics Inc., Eugene, USA) at a sampling rate of 1000 Hz, and galvanic skin response (GSR) measurements were recorded simultaneously with a SenseWear sensor (BodyMedia Inc.) at a sampling rate of 31 Hz. The experiment took place in a shielded room at Pontchaillou hospital, Rennes, France, and was approved by the local ethics review board of Inserm.


The database includes the EEG and other physiological recordings of the 40 subjects, collected during the viewing of the neutral and emotional videos and during the black-screen periods. The data are provided in Matlab file format. Further information on the subjects (age and gender), the individual ratings of the videos (self-assessment labels), and the experiment (order of presentation of the videos) is also included in the Matlab files, along with all the details on preprocessing and data analysis required to reproduce the results presented in (H. Becker, J. Fleureau, P. Guillotel, F. Wendling, I. Merlet, and L. Albera, “Emotion recognition based on high-resolution EEG recordings and reconstructed brain sources”, submitted to IEEE Transactions on Affective Computing, 2016). Moreover, we provide a Matlab script to read the data. More details about the data and the contents of the Matlab files are given below.
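
For orientation, a minimal sketch of how one of these files might be loaded and inspected in Matlab is given below; the file name subject_01.mat is hypothetical, and the actual file and variable names are documented in the tables of the Description section and handled by the provided read_data script.

    % Minimal sketch: load one subject's Matlab file and list its contents.
    % 'subject_01.mat' is a hypothetical file name.
    S = load('subject_01.mat');        % returns a struct with one field per variable
    disp(fieldnames(S));               % names of the variables stored in the file
    whos('-file', 'subject_01.mat');   % sizes and types, without loading the data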


Citing the database

Please cite the following paper in your publications making use of the HR-EEG4EMO database:

H. Becker, J. Fleureau, P. Guillotel, F. Wendling, I. Merlet, and L. Albera, “Emotion recognition based on high-resolution EEG recordings and reconstructed brain sources”, submitted to IEEE Transactions on Affective Computing, 2016

Description

In the following, we describe the contents of the provided Matlab files and give additional information on the data recorded for each subject.

Selected emotional stimuli

The following table describes the 13 emotional videos that were selected from the FilmStim database [1] (http://nemo.psp.ucl.ac.be/FilmStim/). The descriptions of the scenes are reproduced from FilmStim.

[Table: the 13 selected emotional stimuli, with scene descriptions from FilmStim.]

Summary and comments of subject data

The following table provides information on the data available for each subject. Note that the evaluation of the EEG quality is subjective. The suitability of a subject for inclusion in the EEG-based valence recognition analysis is rated on a scale from 1 (very good) to 6 (very bad) and is also subjective. The number of videos corresponds to the number of videos for which the subject reported having felt positive or negative emotions. For subjects whose EEG did not contain marker signals for the beginnings and ends of the videos, the synchronization was performed based on the timestamps of the EEG recordings and the E-Prime files.

[Table: per-subject summary (EEG quality, suitability rating, number of videos, comments).]
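
The timestamp-based synchronization mentioned above essentially converts the offset between a video onset logged by E-Prime and the start of the EEG recording into a sample index. The sketch below illustrates this under assumed variable names (t_eeg_start and t_video_onset as Matlab datetime values); it is not the exact code used to prepare the database.

    % Hedged sketch of timestamp-based synchronization (variable names are
    % hypothetical). t_eeg_start: start of the EEG recording; t_video_onset:
    % onset of a video in the E-Prime log; both are datetime values.
    fs        = 1000;                                  % EEG sampling rate (Hz)
    offset_s  = seconds(t_video_onset - t_eeg_start);  % offset in seconds
    onset_idx = round(offset_s * fs) + 1;              % first EEG sample of the video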

Physiological recordings

For each subject, we provide the following data files:

Subject information:

[Table: subject information.]

Data:

[Table: data.]


Subject’s responses:

[Table: subject’s responses.]


Information about the experiment:

[Table: information about the experiment.]


Information relevant for the data analysis conducted in the associated journal paper:

[Table: information relevant for the data analysis in [2].]

Files for source reconstruction

In order to enable other researchers to reproduce the results described in [2], which are based on the reconstruction of brain activity on the cortical surface, we provide the following files, which characterize the head model employed:

[Table: head model files for source reconstruction.]
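
With these files, brain activity can be reconstructed with any linear inverse method. As an illustration, the sketch below computes a generic minimum-norm estimate; this is not necessarily the inverse solution used in [2], and the variable names (leadfield matrix G of size channels x sources, EEG data X of size channels x samples) are assumptions.

    % Generic minimum-norm source estimate (a sketch; not necessarily the
    % inverse method of [2]). G: leadfield (channels x sources), X: EEG data
    % (channels x samples), lambda: Tikhonov regularization parameter.
    n      = size(G, 1);
    lambda = 0.1 * trace(G * G') / n;                % simple heuristic choice
    S_hat  = G' * ((G * G' + lambda * eye(n)) \ X);  % sources x samples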

Matlab scripts

Along with the data, we provide two Matlab scripts: one for reading the data and one for extracting the features described in [2].

The script read_data includes the following functionalities:

  • load the data
  • display subject information
  • list information of analyzed data intervals
  • display the presentation order of neutral and emotional videos
  • list the indices of bad channels
  • visualize the GSR signals for each emotional video
  • filter EEG data
  • interpolate bad EEG channels from the 4 nearest sensors (a sketch of this and the filtering step follows the list)
  • extract physiological recordings from data array
  • combine EEG and all physiological recordings (including the GSR) in one matrix
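
As a complement to read_data, the sketch below illustrates two of these steps: band-pass filtering and interpolation of a bad channel from its 4 nearest sensors. The variable names (eeg as a channels x samples matrix, pos as a channels x 3 matrix of sensor positions) and the filter settings are assumptions; the authoritative implementation is the one in read_data.

    % Hedged sketch of two read_data steps (variable names are hypothetical).
    % eeg: channels x samples EEG matrix; pos: channels x 3 sensor positions.
    fs     = 1000;                               % sampling rate (Hz)
    [b, a] = butter(4, [1 45] / (fs / 2));       % 4th-order band-pass, 1-45 Hz
    eeg_f  = filtfilt(b, a, eeg')';              % zero-phase filtering per channel

    bad    = 17;                                 % index of a bad channel (example)
    d      = sqrt(sum((pos - pos(bad, :)).^2, 2));  % distances to the bad sensor
    d(bad) = Inf;                                % exclude the bad channel itself
    [~, order] = sort(d);                        % sensors sorted by distance
    eeg_f(bad, :) = mean(eeg_f(order(1:4), :), 1);  % average of the 4 nearest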

The script feature_extraction computes the following features for each EEG channel and for each emotional brain region (see [2] for more details; a sketch of the band-power and higher-order-crossings computations follows the parameter list below):

  • powers in the θ (4 – 8 Hz), α (8 – 13 Hz), β (13 – 30 Hz), low γ (30 – 45 Hz), and high γ (55 – 80 Hz) bands
  • connectivity in the θ, α, β, low γ, and high γ bands
  • connectivity in the whole frequency range
  • higher order crossings for the θ, α, β, low γ, and high γ bands, up to the derivative of order 20
  • higher order crossings for the whole frequency range up to derivative of order 50
  • fractal dimension
  • statistics (minimum, maximum, median, standard deviation, mean and maximum of the first two derivatives, skewness, kurtosis)
  • spectral moments
  • spectral crest factor in the θ, α, β, low γ, and high γ bands

and to vary the following parameters:

  • number of sensors
  • length of the considered data interval
  • number of segments per interval
  • length of the analyzed segment
  • whether or not to apply Independent Component Analysis (ICA) with Wavelet Denoising (WD).
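
To make the feature definitions concrete, the sketch below computes the five band powers and a higher-order-crossings sequence for a single channel x (row vector, sampled at fs Hz). It is a generic implementation under assumed conventions (e.g., Matlab's bandpower from the Signal Processing Toolbox), not necessarily the exact code of feature_extraction.

    % Hedged sketch of two features for one EEG channel x (row vector), fs in Hz.
    bands = [4 8; 8 13; 13 30; 30 45; 55 80];   % theta, alpha, beta, low/high gamma
    p = zeros(size(bands, 1), 1);
    for k = 1:size(bands, 1)
        p(k) = bandpower(x, fs, bands(k, :));   % Welch-based power in band k
    end

    % Higher-order crossings: zero crossings of successive differences of x.
    K   = 20;                                   % highest difference order
    hoc = zeros(K, 1);
    y   = x - mean(x);                          % center the signal
    for k = 1:K
        hoc(k) = sum(abs(diff(sign(y))) > 0);   % zero crossings at this order
        y = diff(y);                            % next-order difference
    end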

References

[1] A. Schaefer, F. Nils, X. Sanchez, and P. Philippot, “Assessing the effectiveness of a large database of emotion-eliciting films: a new tool for emotion researchers,” Cognition & Emotion, vol. 24, no. 7, pp. 1153–1172, 2010.

[2] H. Becker, J. Fleureau, P. Guillotel, F. Wendling, I. Merlet, and L. Albera, “Emotion recognition based on high-resolution EEG recordings and reconstructed brain sources,” submitted to IEEE Transactions on Affective Computing, 2016.

Download


To download the database, we ask you to provide your name, email address, and affiliation, and to fill in and sign the EULA (End User License Agreement) form available here, by which you agree to the terms of use described below. Then send an email to eeg4management@interdigital.com requesting the database, with the signed EULA attached and the above information included. You will receive instructions on how to download the dataset at the provided email address.

We may store the data you supply in order to contact you later about benchmark-related matters. The data will not be used in any other way.

Terms of use

1. Commercial use

The user may only use the dataset for academic research. The user may not use the database for any commercial purposes. Commercial purposes include, but are not limited to:

  • proving the efficiency of commercial systems,
  • training or testing of commercial systems,
  • using screenshots of data from the database in advertisements,
  • selling data from the database,
  • creating military applications.

2. Distribution

The user may not distribute the dataset or portions thereof in any way, with the exception of using small portions of data for the exclusive purpose of clarifying academic publications or presentations. Note that publications will have to comply with the terms stated in article 4.

3. Access

The user may only use the database after this End User License Agreement (EULA) has been signed and returned to the dataset administrators. The signed EULA should be returned in digital format by attaching it to the email requesting access to the dataset. Upon receipt of the EULA, a username and password to access the dataset will be issued. The user may not grant anyone else access to the database by giving out their username and password.

4. Publications

Publications include not only papers, but also presentations for conferences or educational purposes. All documents and papers that report on research using the HR-EEG4EMO dataset must cite the following paper:

H. Becker, J. Fleureau, P. Guillotel, F. Wendling, I. Merlet, and L. Albera, “Emotion recognition based on high-resolution EEG recordings and reconstructed brain sources,” submitted to IEEE Transactions on Affective Computing, 2016.

5. Warranty

The database comes without any warranty. University of Rennes 1, Inserm, LTSI and InterDigital R&D France cannot be held accountable for any damage (physical, financial or otherwise) caused by the use of the database.