This is a pilot database representing the first stage of the final HUMAINE database, which will be available at the end of 2007. It consists of an exemplar set of 50 clips from naturalistic and induced data showing a range of modalities and emotions, and a subset of 16 clips selected to cover a balanced sample of emotional behaviour in a range of contexts. This subset is labelled. The database is mounted on the ANVIL platform.
English with a small set of French and Hebrew clips
Contact either firstname.lastname@example.org or email@example.com
The subset of 16 labelled clips has ethical clearance
HUMAINE is concerned with developing interfaces for machines that will register and respond to emotion. Emotion in the HUMAINE context means more than brief dramatic episodes; the project focuses on pervasive emotion, that is, the feelings, action tendencies and forms of expression that colour most of human life. HUMAINE is also committed to working with data that is as naturalistic as possible, although it recognises that it may need to approach naturalistic data in a series of progressive steps (from easily tractable data to more challenging data). The aim of the database exemplar is to collect and provide examples of the types of data that need to be considered in that context, and to develop and make available a labelling scheme appropriate to the data and the long-term needs of its users.
.avi files readable in ANVIL (see www.dfki.de/~kipp/anvil/)
Data files containing emotion labels, gesture labels, speech labels and FAPs (Facial Animation Parameters), all readable in ANVIL
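ANVIL data files are XML-based, so their contents can be inspected with standard tools before writing any format-specific code. The sketch below is a generic ElementTree walk; it assumes nothing about ANVIL's actual tag names (those are documented at the ANVIL site above) and simply counts how often each element tag occurs in a file:

```python
# Generic first-look inspection of an XML annotation file (e.g. an ANVIL
# data file). No ANVIL-specific schema is assumed: we walk the whole tree
# and count occurrences of each element tag.
import xml.etree.ElementTree as ET
from collections import Counter

def summarize_xml(path):
    """Return a Counter mapping element tag -> number of occurrences."""
    root = ET.parse(path).getroot()
    return Counter(el.tag for el in root.iter())
```

A tag count like this is a convenient sanity check on a downloaded file before building a real parser for the label tracks.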
A variety of methods was used, covering various induction techniques and naturalistic data
These are described at http://emotion-research.net/deliverables/D5g20%final.pdf
Induced and natural
50 clips ranging from 5 seconds to 3 minutes
(i) Global labels applied across a whole emotion episode or 'clip'. Eight global descriptive categories are applied: emotion words, emotion-related states, combination types, authenticity, core affect dimensions, context labels, key events (what the emotion is about), and appraisal categories.
(ii) Time-aligned emotion labels assigned using the TRACE set of programs designed at Queen's University Belfast. Raters use a mouse to trace their perceived impressions of the speaker's emotional state continuously over time on a one-dimensional axis (e.g. intensity, activation/arousal, valence, power). Traces are available on seven dimensions, described in detail at http://emotion-research.net/deliverables/D5g20%final.pdf
6 labellers have labelled 16 clips using the descriptors above.
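As a rough illustration of how such time-aligned ratings from several labellers might be pooled, the sketch below linearly interpolates each rater's trace onto a common time grid and averages them. The (time, value) data layout and all function names are hypothetical, not the TRACE file format:

```python
# Hypothetical representation of a TRACE-style rating: a time-sorted list of
# (time_in_seconds, value) samples on one dimension (e.g. activation).
# This only illustrates the idea of pooling continuous traces across raters;
# the real TRACE output format may differ.

def interpolate(trace, t):
    """Linearly interpolate a sorted (time, value) trace at time t,
    clamping to the first/last sample outside the recorded span."""
    if t <= trace[0][0]:
        return trace[0][1]
    if t >= trace[-1][0]:
        return trace[-1][1]
    for (t0, v0), (t1, v1) in zip(trace, trace[1:]):
        if t0 <= t <= t1:
            return v0 + (t - t0) / (t1 - t0) * (v1 - v0)

def mean_trace(traces, grid):
    """Average several raters' traces on a shared time grid."""
    return [sum(interpolate(tr, t) for tr in traces) / len(traces)
            for t in grid]
```

Resampling onto a shared grid is needed because each rater's mouse movements are sampled at their own irregular instants.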