Faces in context: A review and systematization of contextual influences on affective face processing
(2012)
Facial expressions are of eminent importance for social interaction as they convey information about other individuals’ emotions and social intentions. According to the predominant “basic emotion” approach, the perception of emotion in faces is based on the rapid, automatic categorization of prototypical, universal expressions. Consequently, the perception of facial expressions has typically been investigated using isolated, de-contextualized, static pictures of facial expressions that maximize the distinction between categories. However, in everyday life, an individual’s face is not perceived in isolation, but almost always appears within a situational context, which may arise from other people, the physical environment surrounding the face, as well as multichannel information from the sender. Furthermore, situational context may be provided by the perceiver, including already present social information gained from affective learning and implicit processing biases such as race bias. Thus, the perception of facial expressions is presumably always influenced by contextual variables. In this comprehensive review, we aim at (1) systematizing the contextual variables that may influence the perception of facial expressions and (2) summarizing experimental paradigms and findings that have been used to investigate these influences. The studies reviewed here demonstrate that perception and neural processing of facial expressions are substantially modified by contextual information, including verbal, visual, and auditory information presented together with the face as well as knowledge or processing biases already present in the observer. These findings further challenge the assumption of automatic, hardwired categorical emotion extraction mechanisms predicted by basic emotion theories.
Taking into account a recent model on face processing, we discuss where and when these different contextual influences may take place, thus outlining potential avenues in future research.
The perception of unpleasant stimuli enhances pain perception, whereas the perception of pleasant stimuli decreases it. In contrast, the effects of pain on the processing of emotional stimuli are much less well known. Especially given the recent interest in facial expressions of pain as a special category of emotional stimuli, a main topic in this research line is the mutual influence of pain and facial expression processing. Therefore, in this mini-review we selectively summarize research on the effects of emotional stimuli on pain, but turn more extensively to the opposite direction, namely how pain influences the concurrent processing of affective stimuli such as facial expressions. Based on motivational priming theory, one may hypothesize that the perception of pain enhances the processing of unpleasant stimuli and decreases the processing of pleasant stimuli. This review reveals that the literature is only partly consistent with this assumption: pain reduces the processing of pleasant pictures and happy facial expressions, but does not – or only partly – affect the processing of unpleasant pictures. However, it was demonstrated that pain selectively enhances the processing of facial expressions if these are pain-related (i.e., facial expressions of pain). Extending a mere affective modulation theory, the latter results suggest pain-specific effects which may be explained by the perception-action model of empathy. Together, these results underscore the important mutual influence of pain and emotional face processing.
The emotion of surprise entails a complex of immediate responses, such as cognitive interruption, attention allocation to, and more systematic processing of the surprising stimulus. All these processes serve the ultimate function of increasing processing depth and thus cognitively mastering the surprising stimulus. The present account introduces phasic negative affect as the underlying mechanism responsible for this switch in operating mode. Surprising stimuli are schema-discrepant and thus entail cognitive disfluency, which elicits immediate negative affect. This affect in turn acts like a phasic cognitive tuning, switching the current processing mode from more automatic and heuristic to more systematic and reflective processing. Directly testing the initial elicitation of negative affect by surprising events, the present experiment presented high- and low-surprise neutral trivia statements to N = 28 participants while assessing their spontaneous facial expressions via facial electromyography. High-surprise trivia elicited higher corrugator activity than low-surprise trivia, indicative of negative affect and mental effort, while leaving zygomaticus (positive affect) and frontalis (cultural surprise expression) activity unaffected. Future research should investigate the mediating role of negative affect in eliciting surprise-related outcomes.
Just do it! Guilt as a moral intuition to cooperate - A parallel constraint satisfaction approach
(2011)
After a long dominance of rational models of judgment and decision-making in moral psychology (e.g. Kohlberg, 1969), there is now a strong interest in how intuitions and emotions influence moral judgments and decisions (e.g. Greene, 2007; Haidt, 2001; Monin, Pizarro, & Beer, 2007). In the literature, the influence of emotions on moral decisions is explained by heuristic or non-compensatory information processing (e.g. Sinnott-Armstrong, Young, & Cushman, 2010; Sunstein, 2005; Tobler, Kalis, & Kalenscher, 2008). However, the process of emotion elicitation is ignored. Appraisal theories postulate that emotion elicitation is due to the incoherence (or discrepancy) of behavioral representations such as goals and actions (Moors, 2009). Emotion elicitation and intuitive decision-making can be combined if both processes apply a connectionist information processing structure (e.g. Barnes & Thagard, 1996). The current work contrasts both perspectives on intuitive-emotional decision-making with respect to guilt and cooperation.
The effect of inherently threatening contexts on visuocortical engagement to conditioned threat
(2023)
Fear and anxiety are crucial for adaptive responding in life-threatening situations. Whereas fear is a phasic response to an acute threat accompanied by selective attention, anxiety is characterized by a sustained feeling of apprehension and hypervigilance during situations of potential threat. In the current literature, fear and anxiety are usually considered mutually exclusive, with partially separated neural underpinnings. However, there is accumulating evidence that challenges this distinction between fear and anxiety, and simultaneous activation of fear and anxiety networks has been reported. Therefore, the current study experimentally tested potential interactions between fear and anxiety. Fifty-two healthy participants completed a differential fear conditioning paradigm followed by a test phase in which the conditioned stimuli were presented in front of threatening or neutral contextual images. To capture defense system activation, we recorded subjective (threat, US-expectancy), physiological (skin conductance, heart rate) and visuocortical (steady-state visual evoked potentials) responses to the conditioned stimuli as a function of contextual threat. Results demonstrated successful fear conditioning in all measures. In addition, threat and US-expectancy ratings, cardiac deceleration, and visuocortical activity were enhanced for fear cues presented in threatening compared with neutral contexts. These results are in line with an additive or interactive rather than an exclusive model of fear and anxiety, indicating facilitated defensive behavior to imminent danger in situations of potential threat.
In addition to bradykinesia and tremor, patients with Parkinson’s disease (PD) are known to exhibit non-motor symptoms such as apathy and hypomimia but also impulsivity in response to dopaminergic replacement therapy. Moreover, a plethora of studies observe differences in electrocortical and autonomic responses to both visual and acoustic affective stimuli in PD subjects compared to healthy controls. This suggests that the basal ganglia (BG), as well as the hyperdirect pathway and BG thalamocortical circuits, are involved in affective processing. Recent studies have shown valence and dopamine-dependent changes in synchronization in the subthalamic nucleus (STN) in PD patients during affective tasks. This thesis investigates the role of dopamine, valence, and laterality in STN electrophysiology by analyzing event-related potentials (ERP), synchronization, and inter-hemispheric STN connectivity. STN recordings were obtained from PD patients with chronically implanted electrodes for deep brain stimulation during a passive affective picture presentation task. The STN exhibited valence-dependent ERP latencies and lateralized ‘high beta’ (28–40 Hz) event-related desynchronization. This thesis also examines the role of dopamine, valence, and laterality on STN functional connectivity with the anterior cingulate cortex (ACC) and the amygdala. The activity of these limbic structures was reconstructed using simultaneously recorded electroencephalographic signals. While the STN was found to establish early coupling with both structures, STN-ACC coupling in the ‘alpha’ range (7–11 Hz) and uncoupling in the ‘low beta’ range (14–21 Hz) were lateralized. Lateralization was also observed at the level of synchrony in both reconstructed sources and for ACC ERP amplitude, whereas dopamine modulated ERP latency in the amygdala. These results may deepen our current understanding of the STN as a limbic node within larger emotional-motor networks in the brain.
Are there emotional reactions towards social robots? Could you love a robot? Or, put the other way round: Could you mistreat a robot, tear it apart, and sell it? Media have reported people honoring military robots with funerals, mourning the “death” of a robotic dog, and the humanoid robot Sophia being granted citizenship. But how profound are these reactions? Three experiments take a closer look at emotional reactions towards social robots by investigating the subjective experience of people as well as the motor-expressive level. Contexts of varying degrees of Human-Robot Interaction (HRI) sketch a nuanced picture of emotions towards social robots that encompasses conscious as well as unconscious reactions. The findings advance the understanding of affective experiences in HRI. They also turn the initial question into: Can emotional reactions towards social robots even be avoided?
Pictorial stimuli can vary on many dimensions, several aspects of which are captured by the term ‘visual complexity.’ Visual complexity has been described by example: “a picture of a few objects, colors, or structures would be less complex than a very colorful picture of many objects that is composed of several components.” Prior studies have reported a relationship between affect and visual complexity, whereby complex pictures are rated as more pleasant and arousing. However, a relationship in the opposite direction, an effect of affect on visual complexity, is also possible; emotional arousal and valence are known to influence selective attention and visual processing. In a series of experiments, we found that ratings of visual complexity correlated with affective ratings, and independently also with computational measures of visual complexity. These computational measures did not correlate with affect, suggesting that complexity ratings are separately related to distinct factors. We investigated the relationship between affect and ratings of visual complexity, finding an ‘arousal-complexity bias’ to be a robust phenomenon. Moreover, we found that this bias could be attenuated when explicitly indicated, but it did not correlate with inter-individual difference measures of affective processing and was largely unrelated to cognitive and eyetracking measures. Taken together, the arousal-complexity bias seems to be caused by a relationship between arousal and visual processing, as has been described for the greater vividness of arousing pictures. The described arousal-complexity bias is also of relevance from an experimental perspective because visual complexity is often considered a variable to control for when using pictorial stimuli.
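The abstract mentions computational measures of visual complexity without specifying them. One common proxy in this literature is compression-based complexity: an image with little redundancy compresses poorly and counts as more complex. The sketch below illustrates that idea with standard-library tools only; it is an assumption for illustration, not the specific measures used in the study.

```python
import random
import zlib

def compression_complexity(pixels: bytes) -> float:
    """Compressed-to-raw size ratio: higher means less redundancy, i.e. more complex."""
    return len(zlib.compress(pixels, 9)) / len(pixels)

# Two synthetic 64x64 greyscale "images": a flat field and pure pixel noise.
flat = bytes([128]) * 64 * 64
rng = random.Random(0)
noisy = bytes(rng.randrange(256) for _ in range(64 * 64))

# The flat image compresses almost completely (score near 0);
# the noisy image barely compresses at all (score near 1).
print(compression_complexity(flat) < compression_complexity(noisy))  # True
```

Real measures would operate on actual photographs (e.g. via their file size after lossy or lossless compression), but the ordering principle is the same.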
Humans have the tendency to react with congruent facial expressions when looking at an emotional face. Interestingly, recent studies revealed that several situational moderators can modulate the strength and direction of these reactions. In the current literature, congruent facial reactions to emotional facial expressions are usually described in terms of “facial mimicry” and interpreted as imitative behavior. Thereby, facial mimicry is understood as a process of pure motor resonance resulting from overlapping representations for the perception and the execution of a certain behavior. Motor mimicry, however, is not the only mechanism by which congruent facial reactions can occur. Numerous studies have shown that facial muscles also indicate valence evaluations. Furthermore, facial reactions are also determined by our current emotional state. These considerations suggest that the modulation of congruent facial reactions to emotional expressions can be based on both motor and affective processes. However, a separation of motor and affective processes in facial reactions is hard to make: so far, none of the published studies that attempted such a separation could show a clear involvement of one or the other process. Therefore, the aim of the present line of experiments is to shed light on the involvement of motor and affective processes in the modulation of congruent and incongruent facial reactions. Specifically, the experiments are designed to test the assumptions of a working model on the mechanisms underlying the modulation of facial reactions and to examine the neuronal correlates involved in such modulations with a broad range of methods. Experiments 1 and 2 experimentally manipulate motor and affective mechanisms by using specific contexts. In the chosen settings, motor process models and affective models of valence evaluations make competing predictions about the resulting facial reactions.
The results of Experiment 1 did not support the involvement of valence evaluations in the modulation of congruent and incongruent facial reactions to facial expressions. The results of Experiments 2a and 2b suggest that emotional reactions are the predominant determinant of facial reactions. Experiment 3 aimed at identifying the psychological mediators that indicate motor and affective mechanisms. Motor mechanisms were assessed via the psychological mediator empathy. Additionally, to clarify the role of affective mechanisms, subjective measures of the participants’ current emotional state in response to the presented facial expressions were taken as a psychological mediator. Mediational analyses show that the modulation of congruent facial reactions can be explained by a decrease in state cognitive empathy. This suggests that motor processes mediate the effects of the context on congruent facial reactions. However, such a mechanism could not be observed for incongruent reactions. Instead, it was found that affective processes in terms of emotional reactions are involved in incongruent facial reactions. Additionally, the involvement of a third class of processes, namely strategic processes, was observed. Experiment 4 aimed at investigating whether a change in the strength of perception can explain the contextual modulation of facial reactions to facial expressions. According to motor process models, the strength of perception is directly related to the strength of the spread of activation from perception to the execution of an action, and thereby to the strength of the resulting mimicry behavior. The results suggest that motor mechanisms were involved in the modulation of congruent facial reactions by attitudes. Such an involvement of motor mechanisms could, however, not be observed for the modulation of incongruent reactions. In Experiment 5, the investigation of neuronal correlates was extended to the observation of involved brain areas via fMRI.
The regions of interest representing motor processes were prominent parts of the mirror neuron system (MNS); those representing affective processing were the amygdala, insula, and striatum. It could be shown that changes in the activity of parts of the MNS are related to the modulation of congruent facial reactions. Furthermore, the results revealed the involvement of affective processes in the modulation of incongruent facial reactions. In sum, these results lead to a revised working model of the mechanisms underlying the modulation of facial reactions to emotional facial expressions. The results of the five experiments provide strong support for the involvement of motor mechanisms in congruent facial reactions. No evidence was found for the involvement of motor mechanisms in the occurrence or modulation of incongruent facial reactions. Furthermore, no evidence was found for the involvement of valence evaluations in the modulation of facial reactions. Instead, emotional reactions were found to be involved in the modulation of mainly incongruent facial reactions.
The most prominent brain region evaluating the significance of external stimuli immediately after their onset is the amygdala. Stimuli evaluated as stressful actuate a number of physiological processes as an immediate stress response. Variation in the serotonin transporter gene has been associated with increased anxiety- and depression-like behavior, altered stress reactivity and adaptation, and the pathophysiology of stress-related disorders. In this study, the instant reactions to an acute stressor were measured in a serotonin transporter knockout mouse model. Mice lacking the serotonin transporter were verified to be more anxious than their wild-type conspecifics. Genome-wide gene expression changes in the amygdala were measured after the mice were subjected to a control condition or to an acute stressor, a one-minute exposure to water. The dissection of amygdalae and stabilization of RNA was conducted within nine minutes after the onset of the stressor. This extremely short protocol allowed for the analysis of first-wave primary response genes, typically induced within five to ten minutes of stimulation, and was performed using Affymetrix GeneChip Mouse Gene 1.0 ST Arrays. RNA profiling revealed a largely new set of primary response genes differentially expressed between the acute-stress and control conditions that differed distinctly between wild-type and knockout mice. Accordingly, functional categorization and pathway analysis indicated genes related to neuroplasticity and adaptation in wild-types, whereas knockouts were characterized by impaired plasticity and genes more related to chronic stress and pathophysiology. Our study therefore disclosed different coping styles dependent on serotonin transporter genotype even directly after the onset of stress and accentuates the role of the serotonergic system in processing stressors and threat in the amygdala.
Moreover, several of the first-wave primary response genes that we found might provide promising targets for future therapeutic interventions in stress-related disorders, also in humans.
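The profiling step above screens for genes whose expression differs between the acute-stress and control conditions. As a hedged toy sketch of that kind of screen (not the Affymetrix array pipeline the study used, which also involves normalization, proper test statistics, and multiple-testing correction across tens of thousands of probes), a minimal log2 fold-change filter can be written as follows; the expression values are invented, though Fos and Egr1 are classic immediate primary response genes:

```python
import math

def log2_fold_changes(stress: dict, control: dict) -> dict:
    """Per-gene log2 ratio of mean expression: stress vs. control."""
    mean = lambda xs: sum(xs) / len(xs)
    return {g: math.log2(mean(stress[g]) / mean(control[g])) for g in stress}

# Illustrative expression values (two replicates per condition, values invented).
stress  = {"Fos": [8.0, 10.0], "Egr1": [6.0, 6.0], "Actb": [5.0, 5.0]}
control = {"Fos": [2.0, 2.5],  "Egr1": [5.5, 6.5], "Actb": [5.0, 5.0]}

fc = log2_fold_changes(stress, control)
# Flag genes with at least a two-fold change in either direction.
regulated = sorted(g for g, v in fc.items() if abs(v) >= 1.0)
print(regulated)  # ['Fos']
```

A real analysis would additionally assess statistical significance per gene and correct for multiple testing before calling a gene differentially expressed.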