# FacialExpressionAnalysis

## Facial Expression Analysis with AFFDEX and FACET: A Validation Study

The goal of this study was to validate AFFDEX and FACET, two software-based algorithms for analyzing emotional facial expressions. In Study 1, pictures of standardized emotional facial expressions from three databases, the Warsaw Set of Emotional Facial Expression Pictures (WSEFEP), the Amsterdam Dynamic Facial Expression Set (ADFES), and the Radboud Faces Database (RaFD), were classified with both modules. The results show large variance in accuracy across emotions and databases, with a performance advantage for FACET over AFFDEX. In Study 2, the facial responses of 110 respondents were measured while they viewed emotionally evocative pictures from the International Affective Picture System (IAPS), the Geneva Affective Picture Database (GAPED), and the Radboud Faces Database (RaFD). Accuracy again differed across emotions, and FACET again performed better. Overall, iMotions can achieve acceptable accuracy for standardized pictures of prototypical (vs. natural) facial expressions, but only weak accuracy for more natural facial expressions. We discuss potential sources of this limited validity and suggest research directions in the broader context of emotion research.

## General Information

This repository contains the code and data needed to replicate both studies.

## Citation

To be filled.

## Replication

To replicate the findings of this validation study, treat each folder (Study 1 and Study 2) separately: set the R working directory to the respective R folder, then run each of the numbered scripts in order (see the sketch below). After execution, the results are located in Study {1,2}/data/output.
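A minimal R sketch of this workflow for Study 1 is shown below. It assumes the scripts in Study 1/R carry numeric prefixes so that lexicographic order matches execution order; adjust the path for Study 2 or for the location of your checkout.

```r
# Minimal sketch, assuming the repository root is the current working
# directory and the scripts in Study 1/R are named with numeric prefixes.
setwd("Study 1/R")

# Collect the numbered scripts and run them in ascending order.
scripts <- sort(list.files(pattern = "^[0-9].*\\.R$"))
for (s in scripts) {
  message("Running ", s)
  source(s)
}

# The results are then written to Study 1/data/output.
```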

## Acknowledgments

We thank Adem Halimi and Elena von Wyttenbach for their help with the data collection.