Our faces are one of the strongest indicators of our emotions. Whether we are laughing or crying, our emotions are there for all to see: our faces reflect how we feel.
According to the facial feedback hypothesis, our emotions are intertwined with our facial expressions. For example, an individual who is forced to smile during a social event will gradually find the event more enjoyable.
Our facial expressions are controlled by 43 facial muscles, which are in turn controlled by two facial nerves, one on each side of the face.
Practitioners have been fascinated by facial expression analysis (FEA) for more than 100 years. In 1969, the Swedish anatomist Carl-Herman Hjortsjö created the first FEA system, which Paul Ekman and Wallace Friesen developed into the Facial Action Coding System (FACS) in 1978; FACS was further updated in 2002.
FACS is a fully standardised classification system for facial expressions based on anatomical features. It is used by expert human coders, who carefully examine face videos and describe every facial expression as a combination of elementary components called action units (AUs).
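To make the action-unit idea concrete, here is a toy sketch in Python. The AU combinations follow commonly cited EMFACS-style pairings (for example, happiness as AU6 plus AU12); they are illustrative only, not a complete coding system.

```python
# A toy illustration of the FACS idea: expressions are coded as sets of
# action units (AUs), and AU combinations map to emotion labels.
AU_NAMES = {
    1: "inner brow raiser", 4: "brow lowerer", 6: "cheek raiser",
    9: "nose wrinkler", 12: "lip corner puller", 15: "lip corner depressor",
}

# Commonly cited EMFACS-style AU signatures; illustrative assumptions only.
EMOTION_SIGNATURES = {
    "happiness": {6, 12},     # cheek raiser + lip corner puller
    "sadness":   {1, 4, 15},  # inner brow raiser + brow lowerer + depressor
    "disgust":   {9, 15},     # nose wrinkler + lip corner depressor
}

def match_emotions(observed_aus):
    """Return every emotion whose full AU signature appears in the frame."""
    return [emotion for emotion, signature in EMOTION_SIGNATURES.items()
            if signature <= set(observed_aus)]

print(match_emotions({6, 12}))     # ['happiness']
print(match_emotions({1, 4, 15}))  # ['sadness']
```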
Although FACS is non-intrusive, it is very laborious and expensive. It can take up to an hour and a half to classify a one-minute video.
Facial expressions can also be analysed with facial electromyography (fEMG), which uses electrodes attached to the face to track muscle movement.
This technique measures muscle activity by detecting and amplifying the tiny electrical impulses that are generated by muscle fibres when they contract.
It focuses mainly on two major muscle groups in the face: the corrugator supercilii, which is associated with frowning, and the zygomaticus major, which is associated with smiling.
fEMG can measure very subtle facial muscle activity, but it is intrusive: attaching electrodes to the face heightens respondents' awareness of being measured.
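For readers curious what the underlying signal processing looks like, below is a minimal sketch of standard surface-EMG post-processing in Python with NumPy: full-wave rectification followed by a moving-average envelope, which turns the raw oscillating signal into an estimate of muscle activation over time. The sample rate, window length and synthetic "contraction" are illustrative assumptions.

```python
# Minimal sketch of surface-EMG post-processing: rectify the raw signal,
# then smooth it into an amplitude envelope that tracks muscle activation.
import numpy as np

def emg_envelope(raw_signal, sample_rate_hz=1000, window_ms=100):
    rectified = np.abs(raw_signal)                      # full-wave rectification
    window = int(sample_rate_hz * window_ms / 1000)     # samples per window
    kernel = np.ones(window) / window                   # moving-average kernel
    return np.convolve(rectified, kernel, mode="same")  # smoothed envelope

# Example: a synthetic burst of "zygomaticus" activity riding on noise.
rng = np.random.default_rng(0)
signal = rng.normal(0, 0.05, 2000)           # 2 s of baseline noise at 1 kHz
signal[800:1200] += rng.normal(0, 0.5, 400)  # a 400 ms contraction burst
envelope = emg_envelope(signal)
print(envelope[:800].mean(), envelope[800:1200].mean())  # rest vs. contraction
```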
More recent developments have led to automatic facial expression analysis being carried out by computer software. Progress in computer vision and computational algorithms has made it possible to mimic the face-reading skills of humans by tracking subtle changes in facial features on a moment-by-moment basis.
Emotional AI first detects a human face, whether in a live video feed, an image or a recorded video. Computer vision algorithms identify key landmarks on the face, such as the corners of the eyebrows, the tip of the nose and the corners of the mouth. Deep learning algorithms then analyse the pixels in those regions to classify facial expressions, and combinations of these expressions are mapped to emotions.
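A minimal sketch of this detect-landmark-classify pipeline is shown below, assuming Python with the OpenCV library (`opencv-python`). Only the face detector is real (OpenCV's bundled Haar cascade); the landmark localisation and deep-learning classifier are stubbed out with a placeholder, since commercial vendors each use their own proprietary models.

```python
# A hedged sketch of the pipeline described above. The emotion classifier
# is a uniform-probability placeholder standing in for a trained model.
import cv2

# Label set based on Ekman's universal emotions, discussed below.
EMOTIONS = ["happiness", "sadness", "anger", "fear",
            "surprise", "disgust", "contempt"]

def classify_expression(face_pixels):
    # Placeholder: a production system would align the crop using facial
    # landmarks (eyebrow corners, nose tip, mouth corners) and pass it
    # through a trained CNN. Uniform probabilities keep the sketch runnable.
    return {label: 1.0 / len(EMOTIONS) for label in EMOTIONS}

def analyse_frame(frame):
    # Step 1: locate faces with OpenCV's bundled frontal-face Haar cascade.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)

    # Steps 2-3: crop each detected face and classify its expression.
    return [((x, y, w, h), classify_expression(gray[y:y + h, x:x + w]))
            for (x, y, w, h) in faces]

# Usage (hypothetical image file):
# frame = cv2.imread("shopper.jpg")
# for box, probabilities in analyse_frame(frame):
#     print(box, max(probabilities, key=probabilities.get))
```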
In psychology, an emotion is defined as a complex state of feeling that results in physical and psychological changes that influence thought and behaviour.
Psychologist Paul Ekman determined that there are six core emotions, which he termed universal emotions: happiness, sadness, anger, fear, surprise and disgust.
He later added a seventh emotion that is sometimes considered universal: contempt.
These emotions can then be displayed visually alongside whatever the respondent is interacting with at that moment.
In this example, the in-market respondent, struggling to find a 'red fabric sofa', experiences the negative emotions of anger, surprise, sadness and disgust.
In this second example, the respondent, struggling to work out how to compare his 'favourites', experiences the same negative emotions: anger, surprise, sadness and disgust.
A limitation of facial expression analysis is its inability to assess emotional arousal (the intensity of an emotion). To capture both the valence (quality) of the emotional response and its associated arousal (intensity), FEA needs to be used in conjunction with eye tracking, galvanic skin response (GSR) and electroencephalography (EEG).
Collecting synchronised data from multiple biometric sources adds even more to the picture, as each sensor contributes an aspect of the emotional response that cannot be captured any other way.
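As a hedged sketch of how two synchronised signals might be combined, the Python snippet below maps valence (for example, from FEA) and arousal (for example, from GSR) onto the quadrants of a simple valence-arousal grid. The scales, threshold and quadrant labels are illustrative assumptions, not any vendor's API.

```python
# Combine a valence reading (from FEA) with an arousal reading (from GSR)
# by placing them on the quadrants of a simple valence-arousal grid.
def label_emotion(valence, arousal, threshold=0.0):
    """Map a (valence, arousal) pair to a quadrant label.

    valence: -1.0 (negative) .. +1.0 (positive), e.g. from FEA
    arousal: -1.0 (calm)     .. +1.0 (excited),  e.g. from GSR
    """
    if valence >= threshold:
        return "excited/joyful" if arousal >= threshold else "content/relaxed"
    return "angry/stressed" if arousal >= threshold else "sad/bored"

# Example: a synchronised sample where FEA reads mildly negative valence
# and GSR reads high arousal lands in the "angry/stressed" quadrant.
print(label_emotion(valence=-0.4, arousal=0.7))
```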
Biometrics are redefining conversion rate optimisation (CRO) and what it means to optimise. The pioneering e-commerce brands that are already embracing biometrics are gaining a distinct advantage over their competitors. By focusing on creating an emotionally enhanced customer experience, they are winning more new and repeat customers and, even more importantly, building customer loyalty.
External sources/references:
Facial Feedback Hypothesis
Affectiva: https://www.affectiva.com/
Paul Ekman Group
iMotions
Carl-Herman Hjortsjö
Kairos: https://www.kairos.com/