MMI Facial Expression Database


The MMI Facial Expression database holds over 2000 videos and over 500 images of about 50 subjects displaying various facial expressions on command. Along with the data samples, metadata in terms of displayed AUs is available. More data samples, subjects, and metadata are being added on a regular basis, including recordings of spontaneous facial behaviour.



- M. Pantic, M.F. Valstar, R. Rademaker and L. Maat, "Web-based Database for Facial Expression Analysis", Proc. IEEE Int'l Conf. Multimedia and Expo (ICME'05), Amsterdam, The Netherlands, July 2005, DOI: 10.1109/ICME.2005.1521424


Fill in the form at the above-mentioned web site, then sign and send the EULA, and an account for the database will be created. The database is fully web-based, meaning that browsing through the DB, searching for specific samples, and downloading samples are all done via the Internet.

Fact Sheet
Categorized by modality:
Categorized by descriptor:
Face: Feature points and Action Units (AUs)
Purpose of collection:

Providing input and ground truth to facial feature extraction and expression recognition algorithms


Recording format:

24-bit images (frontal-, profile- and dual-view recordings) of 720×576 pixels, 24 frames per second, PAL format
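As a rough sanity check (not stated in the source), the uncompressed data rate implied by these specifications can be computed:

```python
# Sketch: uncompressed data rate implied by the stated recording format.
# 720x576 pixels, 24-bit (3 bytes) colour per pixel, 24 frames per second.
WIDTH, HEIGHT = 720, 576
BYTES_PER_PIXEL = 3          # 24-bit colour
FPS = 24

bytes_per_frame = WIDTH * HEIGHT * BYTES_PER_PIXEL   # 1,244,160 bytes
bytes_per_second = bytes_per_frame * FPS             # raw, before any compression

print(f"{bytes_per_frame} bytes/frame, "
      f"{bytes_per_second / 2**20:.1f} MiB/s uncompressed")
```

This is only the raw figure; the distributed videos are of course compressed.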

Elicitation method:

The subjects were asked to display 79 series of expressions that included either a single AU (e.g., AU2) or a combination of a minimal number of AUs (e.g., AU8 cannot be displayed without AU25) or a prototypic combination of AUs (such as in expressions of emotion). Also, a short neutral state is available at the beginning and at the end of each expression.
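AU combinations like the ones described above are conventionally written as '+'-joined FACS labels (e.g. "AU8+AU25"). As a minimal sketch, assuming that string convention (the database's actual metadata format is not shown here), such a label can be parsed like this:

```python
import re

def parse_au_combination(label: str) -> list[int]:
    """Parse a FACS AU-combination label such as 'AU8+AU25' into a
    sorted list of AU numbers. The '+'-separated 'AU<n>' string format
    is an assumption, not the database's documented metadata format."""
    aus = [int(m) for m in re.findall(r"AU\s*(\d+)", label)]
    if not aus:
        raise ValueError(f"no Action Units found in {label!r}")
    return sorted(set(aus))

# AU8 cannot be displayed without AU25, so it appears as a combination:
print(parse_au_combination("AU8+AU25"))  # [8, 25]
print(parse_au_combination("AU2"))       # [2]
```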

Nature of material:

52 different faces of both sexes (48% female), aged 19 to 62, of European, Asian, or South American ethnic background; natural lighting and variable backgrounds (for some samples)

Type of emotional description:

Action Units, metadata (data format, facial view, shown AU, shown emotion, gender, age), analysis of AU temporal activation patterns (onset -> apex -> offset)
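The onset → apex → offset ordering of an AU's temporal phases can be captured in a small record type. This is a hypothetical sketch (the field names and frame-index representation are illustrative, not the database's actual metadata schema):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AUActivation:
    """Hypothetical record of one AU activation and its temporal
    phases, given as frame indices in the video."""
    au: int
    onset: int   # first frame where the AU starts to appear
    apex: int    # frame of peak intensity
    offset: int  # frame where the face returns to neutral

    def __post_init__(self):
        # The phases must occur in order: onset -> apex -> offset.
        if not (self.onset <= self.apex <= self.offset):
            raise ValueError("expected onset <= apex <= offset")

event = AUActivation(au=2, onset=10, apex=25, offset=40)
print(event.au, event.offset - event.onset)  # AU number, activation length in frames
```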

How was emotion determined?

Expert annotator

Emotional content:

Single AU and multiple AU activation
