Why are you looking like that? How the context influences evaluation and processing of human faces
(2013)
Perception and evaluation of facial expressions are known to be heavily modulated by emotional features of contextual information. Such contextual effects, however, might also be driven by non-emotional aspects of contextual information, an interaction of emotional and non-emotional factors, and by the observers’ inherent traits. Therefore, we sought to assess whether contextual information about self-reference in addition to information about valence influences the evaluation and neural processing of neutral faces. Furthermore, we investigated whether social anxiety moderates these effects. In the present functional magnetic resonance imaging (fMRI) study, participants viewed neutral facial expressions preceded by a contextual sentence conveying either positive or negative evaluations about the participant or about somebody else. Contextual influences were reflected in rating and fMRI measures, with strong effects of self-reference on brain activity in the medial prefrontal cortex and right fusiform gyrus. Additionally, social anxiety strongly affected the response to faces conveying negative, self-related evaluations as revealed by the participants’ rating patterns and brain activity in cortical midline structures and regions of interest in the left and right middle frontal gyrus. These results suggest that face perception and processing are highly individual processes influenced by emotional and non-emotional aspects of contextual information and further modulated by individual personality traits.
It has been demonstrated that verbal context information alters the neural processing of ambiguous faces, such as faces with no apparent facial expression. Due to their ambiguous nature, neutral faces may be implicitly threatening for socially anxious individuals, but even more so when placed in self-referential negative contexts. Therefore, we measured event-related brain potentials (ERPs) in response to neutral faces preceded by affective verbal information (negative, neutral, positive). Participants with low social anxiety (LSA; n = 23) and high social anxiety (HSA; n = 21) were asked to watch and rate the valence and arousal of the respective faces while continuous EEG was recorded. ERP analysis revealed that HSA participants showed elevated P100 amplitudes in response to faces but reduced structural encoding of faces, as indexed by reduced N170 amplitudes. In general, affective context led to an enhanced early posterior negativity (EPN) for faces in negative compared to neutral contexts. Moreover, HSA compared to LSA participants showed enhanced late positive potentials (LPP) to negatively contextualized faces, whereas in LSA this effect was found for faces in positive contexts. HSA participants also rated faces in negative contexts as more negative than LSA participants did. These results point to enhanced vigilance for neutral faces regardless of context in HSA, while structural encoding seems to be diminished (avoidance). Interestingly, later components of sustained processing (LPP) indicate that LSA show enhanced visuocortical processing of faces in positive contexts (happy bias), whereas in HSA this holds for negatively contextualized faces (threat bias). Finally, our results add further evidence that top-down information, in interaction with individual anxiety levels, can influence early-stage aspects of visual perception.
In everyday life, multiple sensory channels jointly trigger emotional experiences, and one channel may alter processing in another. For example, seeing an emotional facial expression and hearing the voice's emotional tone jointly create the emotional experience. This case, in which auditory and visual input relate to social communication, has received considerable attention from researchers. However, interactions of visual and auditory emotional information are not limited to social communication but extend to much broader contexts, including human, animal, and environmental cues. In this article, we review current research on audiovisual emotion processing beyond face-voice stimuli to develop a broader perspective on multimodal interactions in emotion processing. We argue that current concepts of multimodality should be extended to consider an ecologically valid variety of stimuli in audiovisual emotion processing. Therefore, we provide an overview of studies in which emotional sounds and their interactions with complex pictures of scenes were investigated. In addition to behavioral studies, we focus on neuroimaging and electro- and peripheral-physiological findings. Furthermore, we integrate these findings and identify similarities and differences. We conclude with suggestions for future research.