Facial expressions can be categorized along the upper-lower facial axis, from a perceptual perspective (2021)

Chao Ma, Nianxin Guo, Faraday Davies, Yantian Hou, Suyan Guo, Xun Zhu
Department of Psychology, Normal College, Shihezi University, Xinjiang, China.

A critical question, fundamental for building models of emotion, is how to categorize emotions. Previous studies have typically taken one of two approaches: (a) they focused on pre-perceptual visual cues, i.e., how salient facial features or configurations are displayed; or (b) they focused on post-perceptual affective experiences, i.e., how emotions affect behavior. In this study, we attempted to group emotions at a peri-perceptual processing level: it is well known that humans perceive different facial expressions differently; can we therefore classify facial expressions into distinct categories in terms of their perceptual similarities? Here, using a novel non-lexical paradigm, we assessed the perceptual dissimilarities between 20 facial expressions using reaction times. Multidimensional scaling analysis revealed that facial expressions were organized predominantly along the upper-lower face axis. Cluster analysis of the behavioral data delineated three superordinate categories, and eye-tracking measurements validated these clustering results. Interestingly, these superordinate categories can be conceptualized according to how facial displays interact with acoustic communication: the first group comprises expressions with salient mouth features, which likely link to species-specific vocalization, for example, crying and laughing. The second group comprises visual displays with diagnostic features in both the mouth and the eye regions; these are not directly articulable but can be expressed prosodically, for example, sad and angry. Expressions in the third group are also whole-face expressions but are completely independent of vocalization, and are likely blends of two or more elementary expressions. We propose a theoretical framework to interpret this tripartite division, in which the distinct expression subsets are interpreted as successive phases in an evolutionary chain.
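The analysis pipeline described above (reaction-time dissimilarities, then multidimensional scaling, then cluster analysis) can be sketched in a few lines. This is an illustrative reconstruction under stated assumptions, not the authors' actual code: the dissimilarity matrix here is random placeholder data, and average-linkage clustering is one common choice, not necessarily the method used in the study.

```python
# Illustrative sketch (not the authors' pipeline): embed a reaction-time
# dissimilarity matrix with MDS and cut a hierarchical clustering into
# three superordinate categories.
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

rng = np.random.default_rng(0)

n = 20  # 20 facial expressions, as in the study
# Hypothetical symmetric dissimilarity matrix; in the study this would be
# derived from reaction times in the discrimination task.
d = rng.random((n, n))
D = (d + d.T) / 2
np.fill_diagonal(D, 0.0)

# Multidimensional scaling: place expressions in a 2-D space so that
# inter-point distances approximate the perceptual dissimilarities.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(D)

# Hierarchical clustering on the same dissimilarities, cut into three
# clusters (the three superordinate categories reported in the abstract).
Z = linkage(squareform(D, checks=False), method="average")
labels = fcluster(Z, t=3, criterion="maxclust")
```

In the real analysis, the dominant MDS axis would then be inspected for the reported upper-lower face organization, and the cluster memberships compared against eye-tracking measures.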

MeSH Terms

D010775 Photic Stimulation: Investigative technique commonly used during ELECTROENCEPHALOGRAPHY in which a series of bright light flashes or visual patterns are used to elicit brain activity. Entry terms: Stimulation, Photic; Visual Stimulation; Photic Stimulations; Stimulation, Visual; Stimulations, Photic; Stimulations, Visual; Visual Stimulations
D011930 Reaction Time: The time from the onset of a stimulus until a response is observed. Entry terms: Response Latency; Response Speed; Response Time; Latency, Response; Reaction Times; Response Latencies; Response Times; Speed, Response; Speeds, Response
D004644 Emotions: Those affective states which can be experienced and have arousing and motivational properties. Entry terms: Feelings; Regret; Emotion; Feeling; Regrets
D005145 Face: The anterior portion of the head that includes the skin, muscles, and structures of the forehead, eyes, nose, mouth, cheeks, and jaw. Entry terms: Faces
D005149 Facial Expression: Observable changes of expression in the face in response to emotional stimuli. Entry terms: Face Expression; Expression, Face; Expression, Facial; Face Expressions; Facial Expressions
D006801 Humans: Members of the species Homo sapiens. Entry terms: Homo sapiens; Man (Taxonomy); Human; Man, Modern; Modern Man

Related Publications

October 2007, The Neuroscientist
January 1982, The British Journal of Psychiatry
April 1980, JAMA
November 2016, Vision Research
October 2012, Emotion (Washington, D.C.)
September 2016, Animal Cognition
March 2018, Neuropsychologia
November 2023, Journal of Intelligence
April 2016, NeuroImage
June 2003, Brain and Cognition