Why are you looking like that? How the context influences evaluation and processing of human faces
(2013)
Perception and evaluation of facial expressions are known to be heavily modulated by emotional features of contextual information. Such contextual effects, however, might also be driven by non-emotional aspects of contextual information, an interaction of emotional and non-emotional factors, and by the observers’ inherent traits. Therefore, we sought to assess whether contextual information about self-reference in addition to information about valence influences the evaluation and neural processing of neutral faces. Furthermore, we investigated whether social anxiety moderates these effects. In the present functional magnetic resonance imaging (fMRI) study, participants viewed neutral facial expressions preceded by a contextual sentence conveying either positive or negative evaluations about the participant or about somebody else. Contextual influences were reflected in rating and fMRI measures, with strong effects of self-reference on brain activity in the medial prefrontal cortex and right fusiform gyrus. Additionally, social anxiety strongly affected the response to faces conveying negative, self-related evaluations as revealed by the participants’ rating patterns and brain activity in cortical midline structures and regions of interest in the left and right middle frontal gyrus. These results suggest that face perception and processing are highly individual processes influenced by emotional and non-emotional aspects of contextual information and further modulated by individual personality traits.
In everyday life, multiple sensory channels jointly trigger emotional experiences, and one channel may alter processing in another. For example, seeing an emotional facial expression and hearing the emotional tone of a voice jointly create the emotional experience. This example, in which auditory and visual input relate to social communication, has gained considerable attention from researchers. However, interactions of visual and auditory emotional information are not limited to social communication but can extend to much broader contexts, including human, animal, and environmental cues. In this article, we review current research on audiovisual emotion processing beyond face-voice stimuli to develop a broader perspective on multimodal interactions in emotion processing. We argue that current concepts of multimodality should be extended to consider an ecologically valid variety of stimuli in audiovisual emotion processing. Therefore, we provide an overview of studies in which emotional sounds and their interactions with complex pictures of scenes were investigated. In addition to behavioral studies, we focus on neuroimaging, electrophysiological, and peripheral physiological findings. Furthermore, we integrate these findings and identify similarities and differences. We conclude with suggestions for future research.