Friday, 21 March 2014

Tutorial: Validating Videos of Facial Affect

Today, I ran a little experiment to validate videos of facial affect. Your institute probably has the Ekman faces CD-ROM, right? If so, you can verify that it holds over a hundred pictures of very retro-looking faces of people, including a younger Paul Ekman himself, enacting six different emotions – happy, angry, sad, disgusted, surprised, fearful – and (sometimes) a neutral baseline.
You could argue that enough is enough and these pictures are too old, or perhaps that emotions in the wild are rarely displayed statically. My reason was less obvious: for a project on the affective and cognitive effects of touch, we somehow ended up with a giant construct of a study that includes a bar in virtual reality, where one meets a 3D model that displays some emotion and then touches the participant, whose EEG we measure. I promise you it honestly makes sense and is great fun to do, but in the meantime it is clear that we need to validate animated (as opposed to static) faces, so that we know in advance that the emotions are recognized consistently.
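What "recognized consistently" boils down to is a per-video recognition rate: how often raters pick the emotion the clip was meant to display. As a minimal sketch of that bookkeeping in Python (not the analysis we actually used; the file name ratings.csv, the column names video, intended and response, and the 70% cutoff are all assumptions for illustration):

    import csv
    from collections import defaultdict

    # Hypothetical input: one row per rating, with columns
    # video, intended, response (all names are assumptions).
    hits = defaultdict(int)    # ratings matching the intended emotion
    totals = defaultdict(int)  # all ratings per video

    with open("ratings.csv", newline="") as f:
        for row in csv.DictReader(f):
            totals[row["video"]] += 1
            if row["response"] == row["intended"]:
                hits[row["video"]] += 1

    # Call a clip consistently recognized if, say, 70% of raters
    # chose the intended emotion (the threshold is a placeholder).
    for video in sorted(totals):
        rate = hits[video] / totals[video]
        flag = "OK" if rate >= 0.70 else "re-shoot?"
        print(f"{video}: {rate:.0%} ({flag})")

Whatever cutoff you use instead of the 70% placeholder, the point is to fix the criterion before you look at the data, so the validation stays a validation.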