The fascination with reading faces and predicting behaviour or life events has persisted through the ages, and it remains a much-coveted ‘power’. The answer, however, is probably less magical and more analytical in nature.
Interpreting the emotions displayed by others remains one of the most realistic and attainable ‘superhuman’ skills. It is a vital communicative tool and an indispensable threat radar, helping individuals (and animals) perceive danger effectively for continued survival and safety. Although some question the extent to which emotions are viscerally understood and interpreted across the globe, it is widely believed that the five basic emotions—happiness, sadness, anger, disgust, and fear—are universal in nature and are expressed, recognized, and labelled in a largely equivalent manner across diverse cultural settings and time periods. This is attributable to cultural antecedents which, over the centuries, have been ingrained in us through social learning. For example, fear may be induced by an antecedent or stimulus capable of causing physical or psychological harm, such as darkness or snakes. Similarly, anger may be induced by an antecedent of betrayal or infidelity.
Cognitive empathy, the subset of the empathy construct that deals with understanding the emotions and behaviour of others, is often associated with emotion recognition (ER) skills. Although it appears to be an integral component of Emotional Intelligence (EI) and Social Intelligence (SI), it is not necessarily a prerequisite for emotional or affective empathy, the subset of the empathy construct that deals with experiencing others’ emotional states affectively. Several factors influence ER abilities in individuals, including cultural specificity, affective physiological state, personality type, early life experiences, psychopathology, anxiety, and gender.
Paul Ekman, an influential psychologist known for his work on emotions and facial microexpressions, developed, along with his colleagues, the Facial Action Coding System (FACS), an anatomically based system for systematically measuring any facial expression an individual is capable of making. FACS also serves as a training module for students and professionals interested in learning and developing ER skills. Apart from FACS, the Geneva Emotion Recognition Test (GERT) and the Multimodal Emotion Recognition Test (MERT) are robust performance-based tests that measure ER skills through visual, auditory, and pictorial (MERT) as well as audio-visual (MERT and GERT) modes of transmission.
In the recent past, applications of ER have expanded tremendously, ranging from the enhancement of human–computer interaction and the simplification of software usability to the augmentation of public safety measures. ER has also been extended to military healthcare, providing support to personnel dealing with psychological issues such as PTSD, depression, and suicidal ideation. Moreover, some people have successfully turned their ER skills (along with other observational skills) into a spiritual business of psychic and face reading.
Apart from being widely studied by researchers, ER has also intrigued the film industry, where writers and directors have explored its possible applications. TV shows like Lie to Me and The Mentalist extend ER skills into law enforcement, using deception detection and the reading of facial macro- and microexpressions for crime-solving and offender profiling. Similarly, movies like Her and Ex Machina explore a futuristic (and possibly dystopian) view in which human–computer interaction is taken to the next level through artificial intelligence and the invention of conscious, self-aware ‘robots’ possessing affective and cognitive empathy-like skills, capable of experiencing and comprehending the emotions of self and others.
[Note: Monk Prayogshala is currently conducting research in the area of emotion recognition and its relationship with personality types. If you wish to participate and contribute to scientific research, click here!]