Automated Facial Expression Analysis

Facial muscle activity is highly specialized for expression, allowing us to share social information and communicate non-verbally. A small set of distinctive facial configurations has been found to be associated with distinct emotions across age, gender, and culture.
Automated facial expression analysis applies advanced computer vision and machine learning tools to categorize moment-to-moment changes in subjects’ emotions on the basis of small movements of more than 40 facial muscles. Using facial expression analysis in concert with eye-tracking enables our lab to better link visual attention to emotional response. The software identifies changes in micro-expressions (lasting only milliseconds) associated with discrete basic emotions (e.g., joy, surprise, anger, sadness, fear, contempt, disgust), complex emotions (e.g., frustration and confusion), and valence (positive, negative, or neutral). We can even track the activity of individual facial muscles.
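To illustrate the general idea, the minimal sketch below shows how per-frame facial muscle (Action Unit) intensities might be mapped to a coarse emotion label and valence. It is not the lab's actual pipeline or any specific software package: the AU names, thresholds, and rules are hypothetical simplifications, and the intensity scores are assumed to come from an upstream computer vision model.

from typing import Dict

# Hypothetical AU intensity scores for one video frame, scaled 0.0-1.0,
# assumed to be produced by an upstream face-tracking model.
frame_aus: Dict[str, float] = {
    "AU6_cheek_raiser": 0.8,
    "AU12_lip_corner_puller": 0.9,   # smile
    "AU4_brow_lowerer": 0.1,
    "AU15_lip_corner_depressor": 0.0,
}

def classify_frame(aus: Dict[str, float], threshold: float = 0.5) -> Dict[str, str]:
    """Return a rough emotion label and valence from AU activations.

    The rules here are illustrative only; real systems use trained
    classifiers over many more muscles and temporal context.
    """
    smile = (aus.get("AU6_cheek_raiser", 0.0) > threshold
             and aus.get("AU12_lip_corner_puller", 0.0) > threshold)
    frown = aus.get("AU4_brow_lowerer", 0.0) > threshold
    sad = aus.get("AU15_lip_corner_depressor", 0.0) > threshold

    if smile:
        return {"emotion": "joy", "valence": "positive"}
    if sad:
        return {"emotion": "sadness", "valence": "negative"}
    if frown:
        return {"emotion": "anger", "valence": "negative"}
    return {"emotion": "neutral", "valence": "neutral"}

print(classify_frame(frame_aus))  # {'emotion': 'joy', 'valence': 'positive'}

In practice, such frame-level labels would be aggregated over time and aligned with eye-tracking data so that emotional responses can be linked to what the participant was looking at in that moment.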