Electroencephalography (EEG)-based emotion recognition is an active topic in the affective computing community. Numerous studies have been published on this subject, generally following the same scheme: 1) presentation of emotional stimuli to a number of subjects while their EEG is recorded, and 2) application of machine learning techniques to classify the subjects' emotions. The proposed approaches vary mainly in the type of features extracted from the EEG and in the classifiers employed, but the reported results are difficult to compare because they are obtained on different datasets. In this paper, we present a new, publicly available database for the analysis of valence (positive or negative emotions). The database comprises physiological recordings and 257-channel EEG data, in contrast to all previously published datasets, which include at most 62 EEG channels. Furthermore, we reconstruct the brain activity on the cortical surface by applying source localization techniques. We then compare the valence classification performance achievable with various features extracted from all source regions (source space features) and from all EEG channels (sensor space features), showing that source reconstruction improves the classification results. Finally, we discuss the influence of several parameters on the classification scores.
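The source reconstruction step mentioned above amounts to solving a linear inverse problem that maps sensor-space recordings to cortical source activity. The following is a minimal sketch of this idea, assuming a toy random leadfield matrix and a minimum-norm inverse with toy dimensions; it is an illustration of the general technique, not the paper's actual method, forward model, or channel count:

```python
# Illustrative sketch (not the paper's pipeline): minimum-norm source
# reconstruction projecting sensor-space EEG into source space.
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions for illustration only (the paper uses 257 channels).
n_channels, n_sources, n_samples = 16, 32, 200

# Hypothetical leadfield matrix mapping source activity to sensor readings.
L = rng.standard_normal((n_channels, n_sources))

# Simulated source activity and the resulting noisy sensor recordings.
S_true = rng.standard_normal((n_sources, n_samples))
X = L @ S_true + 0.1 * rng.standard_normal((n_channels, n_samples))

# Minimum-norm inverse operator: W = L^T (L L^T + lambda * I)^{-1},
# where lambda regularizes against measurement noise.
lam = 1.0
W = L.T @ np.linalg.inv(L @ L.T + lam * np.eye(n_channels))

# Estimated source-space signals; classification features (source space
# features) would then be extracted from these instead of from X.
S_hat = W @ X
print(S_hat.shape)
```

In this sketch, sensor space features would be computed from `X` and source space features from `S_hat`, after which the two feature sets can be fed to the same classifier for comparison.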