Easy-to-use validated facial emotion expression for virtual characters

Overview
Description:

This tool is intended for those who need virtual characters with valid and reliable facial emotion expression. The model does not rely on pre-recorded animations; instead it uses direct bone control to express and blend expressions. The expressions have been evaluated in (Broekens, Qu, & Brinkman, 2012). The tool contains the following items:

1. An executable for testing and checking out the expressions that can be generated with the software and models.
2. Two example models that are instrumented to express emotions on the face, according to the method described in (Broekens et al., 2012).
3. The technical report (please cite it when you use the tool, or parts of it, its method, data, etc.).
4. Three pieces of Python code:
a. A class to control facial expressions.
b. A class to manage an emotional state.
c. An example world (the experiment used to test the faces) that can be imported and run in Vizard 3D (http://www.worldviz.com/products/vizard).

Usage of the emotional state and expression code is straightforward, and setting up an avatar with emotions takes only a couple of lines of code. Please see the class AffectManager in emotion.py, and the example code in expressionexperiment2.py, in the _init method of the Character class. There you will find how to control the face with keys, as well as how to set up a character with emotions; a minimal sketch is given below.
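
For illustration, the following minimal sketch shows roughly how a character could be set up with emotions in Vizard. Only AffectManager and emotion.py come from the package; the avatar file name and the method names on the manager are assumptions, so check expressionexperiment2.py for the actual calls.

import viz
from emotion import AffectManager

viz.go()

# Load one of the example head models shipped with the tool
# (the file name here is illustrative, not the actual asset name).
avatar = viz.addAvatar('example_head.cfg')

# Attach an AffectManager to drive the FACS-based bone controls
# (constructor signature assumed).
affect = AffectManager(avatar)

# Set an emotional state; the manager blends the corresponding facial
# expression onto the face (method name and parameters assumed).
affect.setEmotion('happy', intensity=0.8)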

Please note (see the paper and the example models for how) that a new virtual character model needs to be "rigged", i.e., FACS-based muscle attachment must be done manually for every new facial morphology. Two example models in Vizard format are provided with the tool. Tip: simply changing the texture of the head already creates a different look.
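
As a concrete example of the texture tip, the sketch below swaps the head texture in Vizard using its standard texture API; the file names are illustrative and it is assumed the head model accepts a whole-object texture.

import viz

viz.go()
head = viz.addAvatar('example_head.cfg')       # illustrative file name

# Apply a different skin texture to change the character's look
# without re-rigging the FACS muscle attachments.
skin = viz.addTexture('alternative_skin.jpg')  # illustrative file name
head.texture(skin)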

An overview of the process of creating a new face with a different morphology can be found in the Facial emotion expression overview.pptx file in the package.
If you have any questions about how to rig the 3D model with FACS muscle attachments prior to using it for expression, please contact joost.broekens@gmail.com.


Contact:

joost.broekens@gmail.com

Papers:

Broekens, J., Qu, C., & Brinkman, W.-P. (2012). Dynamic Facial Expression of Emotion Made Easy. Interactive Intelligence, Delft University of Technology.

Access:

Free download from website.

Fact Sheet
Categorized by type:
Embodied Conversational Agent; Other: facial expression