Abstract
Perceiving specific emotions from others’ faces is a crucial ability for flexible and adaptive interaction, but the role of emotion concepts and categories in this perception has been controversial. The present study investigates the precise time course over which emotion concepts and categories become involved in facial emotion perception. We conducted a behavioral conceptual similarity rating task for emotion words, and emotion categorization tasks for emotion words and faces while recording electroencephalographic (EEG) signals. We then performed a representational similarity analysis to assess the degree of correspondence over time between the behavioral representations of emotion concepts and categories and the neural representations of emotional faces. The results showed that the behavioral representations of emotion categories and concepts were successively correlated with the neural representations of the late processing stage for emotional faces at ~600–800 ms and ~800–1,000 ms, respectively. Furthermore, the representation of visual features of emotional faces was correlated with the neural representation of the early processing stage for faces (120–160 ms). Together, these results suggest a temporal hierarchy in facial emotion perception that proceeds from visual to emotion-categorical to conceptual feature processing, providing electrophysiological evidence in support of basic emotion theory.
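The core analysis described above, time-resolved representational similarity analysis (RSA), can be illustrated with a minimal sketch. The sketch below is not the authors' pipeline; it uses randomly generated stand-in data, an assumed six-condition design, and Euclidean-distance dissimilarity matrices purely to show the general logic: a model representational dissimilarity matrix (RDM) from behavioral ratings is correlated (Spearman) with a neural RDM computed at each EEG time point.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

n_cond = 6    # hypothetical number of emotion conditions
n_feat = 20   # hypothetical feature dimension (e.g. EEG channels)
n_times = 50  # hypothetical number of time points

# Stand-in for a behavioral model RDM: pairwise dissimilarities
# between emotion conditions derived from similarity ratings.
behav = rng.normal(size=(n_cond, n_feat))
model_rdm = np.linalg.norm(behav[:, None] - behav[None, :], axis=-1)

# Stand-in neural data: condition x time x channel activation patterns.
neural = rng.normal(size=(n_cond, n_times, n_feat))

# Only the upper triangle of an RDM carries unique pairwise distances.
iu = np.triu_indices(n_cond, k=1)

corr_over_time = np.empty(n_times)
for t in range(n_times):
    # Neural RDM at time t: pairwise distances between condition patterns.
    pat = neural[:, t, :]
    neural_rdm = np.linalg.norm(pat[:, None] - pat[None, :], axis=-1)
    # Rank correlation between the vectorized model and neural RDMs.
    rho, _ = spearmanr(model_rdm[iu], neural_rdm[iu])
    corr_over_time[t] = rho
```

Inspecting when `corr_over_time` peaks (with appropriate statistics) is what allows claims such as "category representations correlate with neural data at ~600–800 ms"; with random data here, the correlations simply fluctuate around zero.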
Authors
Guan, Y., Hao, S., Liu, S., Wu, Z., Schwieter, J. W., Liu, H., & He, W.
https://doi.org/10.1093/cercor/bhaf311