HUMAINE Database Stage 1

Overview
Description:

This is a pilot database representing the first stage of the final HUMAINE database, which will be available at the end of 2007. It consists of an exemplar set of 50 clips of naturalistic and induced data covering a range of modalities and emotions, and a subset of 16 clips selected to give a balanced sample of emotional behaviour in a range of contexts. This subset is labelled. The database is mounted on the ANVIL platform.

Contact:

c.cox@qub.ac.uk

Language:

English with a small set of French and Hebrew clips

Access:

Contact either c.cox@qub.ac.uk or e.douglas-cowie@qub.ac.uk

Fact Sheet
Categorized by modality:
AV, gesture
Categorized by descriptor:
Speech: words, pitch traces, auditory-based paralinguistic descriptors
Face: FAPS currently for one clip only
Gesture: manual annotation scheme applied to coding expressive gestures (see Abrilian et al. 2005, Martin et al. 2005)
Ethical Clearance Issues:

The subset of 16 labelled clips has ethical clearance

Purpose of collection:

HUMAINE is concerned with developing interfaces for machines that will register and respond to emotion. Emotion in the HUMAINE context means more than dramatic brief episodes: the focus is on pervasive emotion, that is, the feelings, action tendencies and forms of expression that colour most of human life. HUMAINE is also committed to working with data that is as naturalistic as possible, although it recognises that naturalistic data may need to be approached in a series of progressive steps, from easily tractable data to more challenging data. The aim of the database exemplar is to collect and provide examples of the types of data that need to be considered in that context, and to develop and make available a labelling scheme appropriate both to the data and to the long-term needs of its users.

Format:

.avi files readable in ANVIL (see www.dfki.de/~kipp/anvil/)
data files containing emotion labels, gesture labels, speech labels and FAPS, all readable in ANVIL
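
Since the annotation files are XML, a few lines of standard-library Python are enough to pull the labels out for further processing. This is a minimal sketch only: the tag names used below (track, el, start, end, attribute) are assumptions about ANVIL's track-based XML layout, and the file name in the usage comment is hypothetical; check both against the files actually distributed.

    # Minimal sketch: extract time-stamped labels from an ANVIL annotation file.
    # The tag names below are assumptions about ANVIL's XML layout, not a
    # confirmed specification -- verify against the database's own files.
    import xml.etree.ElementTree as ET

    def read_anvil_tracks(path):
        """Return {track_name: [(start, end, {attr_name: value}), ...]}."""
        tracks = {}
        root = ET.parse(path).getroot()
        for track in root.iter("track"):
            name = track.get("name", "unnamed")
            elements = []
            for el in track.iter("el"):
                start = float(el.get("start", "nan"))
                end = float(el.get("end", "nan"))
                attrs = {a.get("name"): (a.text or "").strip()
                         for a in el.iter("attribute")}
                elements.append((start, end, attrs))
            tracks[name] = elements
        return tracks

    # Hypothetical usage:
    # for start, end, attrs in read_anvil_tracks("clip01.anvil").get("emotion", []):
    #     print(f"{start:.2f}-{end:.2f}s {attrs}")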

Elicitation method:

A variety of methods was used, covering both induction techniques and naturalistic recording

These are described at http://emotion-research.net/deliverables/D5g20%final.pdf

Nature of material:

Induced and natural

Size:

50 clips ranging from 5 seconds to 3 minutes

Type of emotional description:

(i) Global labels applied across a whole emotion episode or 'clip'. Eight global descriptive categories are applied: emotion words, emotion-related states, combination types, authenticity, core affect dimensions, context labels, key events (what the emotion is about), and appraisal categories.

(ii) Time-aligned emotion labels assigned using the TRACE set of programs designed at Queen's University Belfast. Raters use a mouse to trace their perceived impression of the emotional state of the speaker in the clip continuously over time on a one-dimensional axis (e.g. intensity, activation/arousal, valence, power). Traces are available on seven dimensions, described in detail at http://emotion-research.net/deliverables/D5g20%final.pdf

Six labellers have labelled the subset of 16 clips using the descriptors above; a sketch of handling both kinds of description follows.
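
As an illustration of how the two kinds of description might be handled downstream, here is a minimal Python sketch: a per-clip record for the eight global categories, and a routine that averages several raters' continuous traces onto a common time grid. The field names, and the assumption that each trace arrives as ascending (seconds, rating) arrays, are illustrative only; they are not the actual HUMAINE or TRACE file formats.

    # Illustrative sketch only: field names and trace layout are assumptions,
    # not the actual HUMAINE/TRACE formats.
    from dataclasses import dataclass, field
    import numpy as np

    @dataclass
    class GlobalLabels:
        """One record per clip for the eight global descriptive categories."""
        emotion_words: list = field(default_factory=list)
        emotion_related_states: list = field(default_factory=list)
        combination_type: str = ""
        authenticity: str = ""
        core_affect_dimensions: dict = field(default_factory=dict)  # e.g. {"valence": -0.3}
        context_labels: list = field(default_factory=list)
        key_events: str = ""  # what the emotion is about
        appraisal_categories: list = field(default_factory=list)

    def mean_trace(traces, step=0.1):
        """Average raters' (times, values) traces on a shared time grid.

        Each trace is a pair of 1-D arrays (seconds ascending, rating);
        the result is clipped to the length of the shortest trace.
        """
        t_end = min(t.max() for t, _ in traces)
        grid = np.arange(0.0, t_end, step)
        stacked = np.stack([np.interp(grid, t, v) for t, v in traces])
        return grid, stacked.mean(axis=0)

Averaging in this way yields one consensus curve per dimension; the per-rater curves can be kept alongside it to examine disagreement among the six labellers.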
