Puneet Varma (Editor)

Facial coding


Facial coding is the process of measuring human emotions through facial expressions. Emotions can be detected by computer algorithms for automatic emotion recognition that record facial expressions via webcam. This can be applied to better understand people’s reactions to visual stimuli.

History

In 1872, Charles Darwin published the book “The Expression of the Emotions in Man and Animals”. He compared numerous images of humans and animals in different emotional states and suggested that some basic emotions, such as anger, fear and sadness, are universal and present across ethnicities, cultures and even species. According to Darwin, facial expressions are inborn (not learned) and common to humans and animals; some human characteristics, like clenching the teeth in anger or tearing up when sad, have animal origins.

Despite Darwin’s theory, in the 1950s the prevailing belief was that facial expressions were determined by cultural background and learning.

In the 1960s, Paul Ekman, an American psychologist, set out to visit people from different nations (including an isolated indigenous tribe in Papua New Guinea) to study non-verbal behavior across cultures. His research showed that Darwin was right and that facial expressions and emotions are universal, as people from diverse cultural backgrounds interpreted the expressions in photographs in a similar way. Ekman’s work indicated the existence of seven basic emotional states that are universally present: happiness, surprise, fear, anger, disgust, sadness and neutral.

In 1978, Ekman and Friesen updated the Facial Action Coding System (FACS), originally developed by the Swedish anatomist Carl-Herman Hjortsjö. FACS is a tool for classifying all the facial expressions that humans can make. Each component of facial movement is called an action unit (AU), and every facial expression can be broken down into action units. Ekman and Friesen identified 46 distinct action units, each describing an individual facial movement.

Technology

The computer algorithm for facial coding extracts the main features of the face (mouth, eyebrows, etc.) and analyzes the movement, shape and texture of these regions to identify facial action units. It is therefore possible to track tiny movements of the facial muscles on an individual’s face and translate them into universal facial expressions that convey happiness, surprise, sadness, anger and other emotions. An algorithm developed by EyeSee can detect the following seven states: happiness, surprise, puzzlement, disgust, fear, sadness and neutral.
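The last step of this pipeline, mapping detected action units to a basic emotion, can be sketched as follows. This is an illustrative example, not any vendor’s actual algorithm; the AU combinations below follow commonly cited FACS/EMFACS prototypes (e.g. happiness as AU 6 + AU 12), and the matching rule and 0.5 threshold are assumptions for the sketch.

```python
# Illustrative sketch: classify a set of detected FACS action units (AUs)
# against prototypical AU combinations for the basic emotions.
# The prototypes follow commonly cited EMFACS mappings; the scoring rule
# and threshold are assumptions made for this example.

PROTOTYPES = {
    "happiness": {6, 12},             # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},          # inner brow raiser + brow lowerer + lip corner depressor
    "surprise":  {1, 2, 5, 26},       # brow raisers + upper lid raiser + jaw drop
    "fear":      {1, 2, 4, 5, 7, 20, 26},
    "anger":     {4, 5, 7, 23},
    "disgust":   {9, 15, 16},         # nose wrinkler + lip depressors
}

def classify(detected_aus: set[int]) -> str:
    """Return the emotion whose AU prototype best matches the detected AUs."""
    best, best_score = "neutral", 0.0
    for emotion, proto in PROTOTYPES.items():
        # Fraction of the prototype's action units that were detected.
        score = len(detected_aus & proto) / len(proto)
        if score > best_score:
            best, best_score = emotion, score
    # Fall back to "neutral" when no prototype is sufficiently present.
    return best if best_score >= 0.5 else "neutral"
```

For example, a face showing AU 6 and AU 12 together would be classified as happiness, while an empty or ambiguous AU set falls back to neutral.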

Outcomes

The results of facial coding provide insight into viewers’ spontaneous, unfiltered reactions to visual content by recording and automatically analyzing facial expressions during viewing. This enables moment-by-moment emotional and cognitive metrics. Facial expressions are tracked in real time using key points on the viewer’s face to recognize a rich array of both emotional and cognitive states, such as enjoyment, attention and confusion. Many of these responses are so quick and fleeting that viewers may not even remember them, let alone be able to report on them objectively.

Application

Progress in facial coding technology and its accessibility have enabled applications in the field of market research. It can be used to test marketing communication such as advertising, shopper and digital campaigns. Respondents are exposed to visual stimuli (TV commercial, animatic, pre-roll, website, DM, etc.) while the algorithm registers and records their facial expressions via their webcams. The obtained data can then be analyzed; the results indicate valence over time, engagement level, emotional peaks and possibilities for improvement. Some companies conduct this type of research in-house, while others engage companies specializing in facial coding services, such as Affectiva, Realeyes and EyeSee.
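The analysis step described above can be sketched in a small example. This is a hypothetical post-processing routine, not any provider’s actual method: the frame-level emotion scores, the split into positive and negative emotions, and the peak threshold are all illustrative assumptions.

```python
# Hypothetical sketch: derive a valence-over-time curve and flag emotional
# peaks from per-frame emotion scores produced by a facial-coding algorithm.
# The emotion groupings and the 0.5 peak threshold are illustrative assumptions.

POSITIVE = {"happiness", "surprise"}
NEGATIVE = {"sadness", "anger", "fear", "disgust"}

def valence(frame_scores: dict[str, float]) -> float:
    """Valence in [-1, 1]: positive-emotion mass minus negative-emotion mass."""
    pos = sum(frame_scores.get(e, 0.0) for e in POSITIVE)
    neg = sum(frame_scores.get(e, 0.0) for e in NEGATIVE)
    return pos - neg

def peaks(curve: list[float], threshold: float = 0.5) -> list[int]:
    """Frame indices where |valence| crosses the threshold (candidate peaks)."""
    return [i for i, v in enumerate(curve) if abs(v) >= threshold]

# Usage with three mock frames of a commercial (scores are made up):
frames = [
    {"neutral": 1.0},
    {"happiness": 0.8, "neutral": 0.2},
    {"sadness": 0.6, "neutral": 0.4},
]
curve = [valence(f) for f in frames]   # [0.0, 0.8, -0.6]
peak_frames = peaks(curve)             # [1, 2]
```

A real pipeline would aggregate such curves across many respondents before reporting engagement levels or emotional peaks for a campaign.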
