The starting point for this work was the discrepancy between the well-documented difficulty of schizophrenic patients in decoding emotional facial expressions and the lack of knowledge about the processes responsible for it. In recent years, the literature has offered several promising results suggesting that both the processing of faces and of facial expressions can be measured with the electroencephalogram (EEG). The EEG would therefore be a suitable method for investigating the process of emotion decoding in schizophrenic patients. This work addresses two main questions. First, how can the cognitive processes responsible for processing faces and recognizing emotional facial expressions be measured reliably using event-related potentials of the EEG? Second, are these processes impaired in schizophrenic patients compared to healthy participants? To answer the first question, three samples of healthy persons were examined. All three studies showed that the processing of faces, compared to control stimuli, is reflected in a negative component around 170 ms over temporal electrode sites (face peak, N170). Using the source localization program LORETA, the N170 could be localized, among other regions, in the fusiform gyrus, the brain region associated with face processing. No replicable effects could be demonstrated for the decoding of emotional facial expressions. Face processing was then examined in schizophrenic patients: 22 patients diagnosed with schizophrenia were compared with a control group matched for age, sex, and educational status. This analysis indicated a deficit in the early stages of face processing in schizophrenic patients.
This result has not been shown in this form before, and it joins studies that have demonstrated structural changes in the brain regions essential for face processing in schizophrenic patients as well as studies suggesting a general deficit in early visual processing.
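The N170 measurement described above rests on event-related potential (ERP) averaging: many stimulus-locked EEG epochs are baseline-corrected and averaged so that the time-locked component survives while unrelated noise cancels, and the peak is then read off within a latency window. A minimal sketch on synthetic single-channel data (all numbers, amplitudes, and names are illustrative assumptions, not taken from the study):

```python
import numpy as np

def compute_erp(epochs, baseline_samples):
    """Average stimulus-locked EEG epochs into an ERP.

    epochs: array (n_trials, n_samples) for one channel.
    baseline_samples: number of pre-stimulus samples used for baseline correction.
    """
    baseline = epochs[:, :baseline_samples].mean(axis=1, keepdims=True)
    return (epochs - baseline).mean(axis=0)

def peak_in_window(erp, times_ms, lo=130, hi=200):
    """Return latency and amplitude of the most negative deflection
    (e.g. the N170) within a latency window."""
    idx = np.flatnonzero((times_ms >= lo) & (times_ms <= hi))
    peak = idx[np.argmin(erp[idx])]
    return times_ms[peak], erp[peak]

# Synthetic demo: 40 trials at 500 Hz, 100 ms baseline + 400 ms post-stimulus,
# with an artificial negative component of -4 uV centered at 170 ms.
rng = np.random.default_rng(0)
times = np.arange(-100, 400, 2)  # ms; 2 ms per sample at 500 Hz
signal = -4.0 * np.exp(-((times - 170) ** 2) / (2 * 15 ** 2))
epochs = signal + rng.normal(0, 2.0, size=(40, times.size))
erp = compute_erp(epochs, baseline_samples=50)
latency, amplitude = peak_in_window(erp, times)
```

With 40 trials, the trial-to-trial noise shrinks by a factor of about √40 in the average, which is why the negative peak near 170 ms becomes clearly identifiable.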
Mediators of Social Anxiety - External Social Threat-Cues vs. Self-Related Negative Cognitions
(2009)
Based on a review of models and empirical findings, a working model is proposed suggesting that self-related negative cognitions and biased processing of external social threat-cues are mediators of social anxiety. Hypotheses derived from this model were tested in three experiments. The first experiment examined whether levels of trait social anxiousness predicted fearful responding to external social threat-cues (angry vs. neutral and happy facial expressions) during social evaluation. Higher trait social anxiousness predisposes to an inward focus on one’s fear reaction to social threat. Using this strategy was expected to enhance fearful responding to angry facial expressions. A strategy of identifying with angry faces was expected to counteract fearful responding, but was expected to fail more often with increasing levels of trait social anxiousness. To examine these hypotheses, affective modulation of the startle eye-blink was assessed in forty-four undergraduate students. This measure served as a probe into the activation of brain structures involved in the automatic evaluation of environmental threat-cues. Trait and state anxiety as well as explicit emotional responding to the stimuli were assessed with questionnaires and ratings. Processing angry faces potentiated startle amplitudes as expected. Low arousal induced by the stimuli was a probable reason why startle potentiation, rather than attenuation, emerged for happy faces. Trait social anxiousness and the cognitive strategies did not influence these effects. Yet, increased trait social anxiousness predicted decreased startle latency, indicating motor hyper-responsivity, which is part of the clinical presentation of social anxiety disorder (SAD). Processing facial expressions and identifying with them disrupted this association. Previous studies suggest that similar strategies may enhance treatment of SAD. Individuals with SAD were expected to respond with increased arousal to external social threat-cues.
Therefore, the second experiment examined whether nine individuals with SAD showed attentional (prepulse inhibition, PPI) or affective startle modulation to angry as compared to neutral and happy facial expressions. Corrugator supercilii activity was assessed as a behavioral indicator of the effects of facial expressions. The remaining setup resembled the first experiment. Facial expressions did not modulate the startle reflex, but corrugator supercilii activity was sensitive to facial valence. However, the effects were not related to trait social anxiousness. Apparently, angry facial expressions do not act as phobic stimuli for individuals with SAD. The third experiment examined whether focusing on self-related negative cognitions or biased processing of external social threat-cues mediates relationships between trait social anxiety and anxious responding in a socially challenging situation. Inducing self-related negative cognitions vs. relaxation was expected to reveal a functional dependency on the supposed mediation in a multivariate assessment of criteria of the working model. Within this design, the impact of external social threat-cues (facial expressions and emotional words) was compared to control stimuli and context effects, using the startle paradigm. The findings provide first evidence for full statistical mediation, by self-related negative cognitions, of the associations between trait social anxiety and self-reported anxiety as well as parasympathetic withdrawal, when thirty-six undergraduate students anticipated public speaking. Apprehensive arousal, as indicated by increased skin conductance levels and heart rate, was present in all participants. Observer ratings of behavior during public speaking matched the self-rated quality of the performance. None of these measures were correlated with trait social anxiousness. Startle amplitude correlated with state and trait social anxiety, but was not a mediator.
Finally, there was no affective modulation of the startle amplitude by external social threat-cues. These studies advance both our current understanding of the factors that mediate social anxiety responses to situations and our knowledge of the physiological and anatomical mechanisms involved in social anxiety. Based on these findings, a revised version of the working model on mediators of social anxiety is proposed, in the hope that it may aid further research toward the ultimate goal of developing an empirically validated functional anatomical model of social anxiety.
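Statistical mediation of the kind reported above is commonly established by estimating the indirect effect a·b (predictor → mediator, mediator → outcome controlling for the predictor) and bootstrapping its confidence interval. The following is an illustrative sketch on synthetic data; the variable names, effect sizes, and the percentile-bootstrap procedure are assumptions for illustration, not the study's actual analysis:

```python
import numpy as np

def indirect_effect(x, m, y):
    """Simple mediation: a from m ~ x, b from y ~ x + m (OLS); indirect = a*b."""
    a = np.polyfit(x, m, 1)[0]
    design = np.column_stack([np.ones_like(x), x, m])
    coef, *_ = np.linalg.lstsq(design, y, rcond=None)
    return a * coef[2]

def bootstrap_ci(x, m, y, n_boot=2000, seed=0):
    """Percentile-bootstrap 95% confidence interval for the indirect effect."""
    rng = np.random.default_rng(seed)
    n = len(x)
    boots = np.empty(n_boot)
    for i in range(n_boot):
        idx = rng.integers(0, n, n)  # resample cases, keeping x, m, y paired
        boots[i] = indirect_effect(x[idx], m[idx], y[idx])
    return np.percentile(boots, [2.5, 97.5])

# Synthetic demo with n = 36 (the sample size reported above):
# x = trait social anxiety, m = self-related negative cognitions,
# y = self-reported anxiety; effect sizes are invented for illustration.
rng = np.random.default_rng(42)
x = rng.normal(size=36)
m = 0.6 * x + rng.normal(0, 0.3, size=36)
y = 0.5 * m + rng.normal(0, 0.3, size=36)
est = indirect_effect(x, m, y)
ci = bootstrap_ci(x, m, y)
```

A confidence interval that excludes zero supports mediation; full mediation additionally requires that the direct path from x to y becomes non-significant once the mediator is included.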
Are there emotional reactions towards social robots? Could you love a robot? Or, put the other way round: could you mistreat a robot, tear it apart, and sell it? Media reports show people honoring military robots with funerals, mourning the “death” of a robotic dog, and granting the humanoid robot Sophia citizenship. But how profound are these reactions? Three experiments take a closer look at emotional reactions towards social robots by investigating people's subjective experience as well as the motor expressive level. Contexts with varying degrees of Human-Robot Interaction (HRI) sketch a nuanced picture of emotions towards social robots that encompasses conscious as well as unconscious reactions. The findings advance the understanding of affective experiences in HRI. They also turn the initial question around: can emotional reactions towards social robots even be avoided?
The aim of this project was to investigate whether reflex-like innate facial reactions to tastes and odors are altered in patients with eating disorders. Qualitatively different tastes and odors have been found to elicit specific facial expressions in newborns: positive facial reactions in response to pleasant stimuli and negative facial reactions in response to unpleasant stimuli. It is, however, unclear whether these specific facial displays remain stable during ontogeny (1). Although several studies have shown that taste- and odor-elicited facial reactions remain quite stable across the human life-span, the specificity of the research questions as well as the different research methods allow only limited comparisons between studies. Moreover, the gustofacial response patterns might be altered in pathological eating behavior (2). To date, however, the question of whether dysfunctional eating behavior might alter facial activity in response to tastes and odors has not been addressed. Furthermore, changes in facial activity might be linked to deficient inhibitory facial control (3). To investigate these three research questions, facial reactions in response to tastes and odors were assessed. Facial reactions were analyzed using the Facial Action Coding System (FACS; Ekman & Friesen, 1978; Ekman, Friesen, & Hager, 2002) and electromyography.
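Facial electromyography of the kind mentioned above is commonly quantified as the change in rectified muscle activity relative to a pre-stimulus baseline, compared across stimulus conditions. A minimal sketch on synthetic corrugator supercilii data; the sampling rate, window lengths, condition labels, and effect sizes are illustrative assumptions, not the study's parameters:

```python
import numpy as np

def emg_response(raw, fs, baseline_s=1.0, response_s=2.0):
    """Percent change in mean rectified EMG relative to a pre-stimulus baseline.

    raw: array (n_trials, n_samples) of band-passed EMG for one muscle site;
         stimulus onset is `baseline_s` seconds into each trial.
    """
    b = int(baseline_s * fs)
    r = int(response_s * fs)
    rect = np.abs(raw)                       # rectification
    base = rect[:, :b].mean(axis=1)          # pre-stimulus baseline
    resp = rect[:, b:b + r].mean(axis=1)     # post-stimulus response window
    return 100.0 * (resp - base) / base

# Synthetic demo: corrugator activity increases more for an unpleasant (bitter)
# than for a pleasant (sweet) taste; 20 trials, 1 s baseline + 2 s response.
rng = np.random.default_rng(1)
fs = 1000  # Hz

def make_trials(post_stimulus_sd):
    trials = rng.normal(0, 1.0, size=(20, 3 * fs))
    trials[:, fs:] *= post_stimulus_sd  # scale post-stimulus activity
    return trials

bitter = emg_response(make_trials(2.0), fs)
sweet = emg_response(make_trials(1.1), fs)
```

The per-trial percent scores would then enter a condition comparison (e.g. a paired test of bitter vs. sweet), analogous to the condition contrasts coded with FACS.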
Humans have the tendency to react with congruent facial expressions when looking at an emotional face. Interestingly, recent studies revealed that several situational moderators can modulate the strength and direction of these reactions. In the current literature, congruent facial reactions to emotional facial expressions are usually described in terms of “facial mimicry” and interpreted as imitative behavior. Facial mimicry is thereby understood as a process of pure motor resonance resulting from overlapping representations for the perception and the execution of a certain behavior. Motor mimicry, however, is not the only mechanism by which congruent facial reactions can occur. Numerous studies have shown that facial muscles also indicate valence evaluations. Furthermore, facial reactions are also determined by our current emotional state. These considerations suggest that the modulation of congruent facial reactions to emotional expressions can be based on both motor and affective processes. However, motor and affective processes in facial reactions are hard to separate, and none of the published studies that attempted such a separation has so far shown a clear involvement of one process or the other. Therefore, the aim of the present line of experiments is to shed light on the involvement of motor and affective processes in the modulation of congruent and incongruent facial reactions. Specifically, the experiments are designed to test the assumptions of a working model on mechanisms underlying the modulation of facial reactions and to examine the neuronal correlates involved in such modulations with a broad range of methods. Experiments 1 and 2 experimentally manipulate motor and affective mechanisms by using specific contexts. In the chosen settings, motor process models and affective models of valence evaluations make competing predictions about the resulting facial reactions.
The results of Experiment 1 did not support the involvement of valence evaluations in the modulation of congruent and incongruent facial reactions to facial expressions. The results of Experiments 2a and 2b suggest that emotional reactions are the predominant determinant of facial reactions. Experiment 3 aimed at identifying the psychological mediators that indicate motor and affective mechanisms. Motor mechanisms were assessed via the psychological mediator empathy. Additionally, subjective measures of the participants’ current emotional state in response to the presented facial expressions were taken as a psychological mediator to clarify the role of affective mechanisms. Mediational analyses show that the modulation of congruent facial reactions can be explained by a decrease in state cognitive empathy. This suggests that motor processes mediate the effects of the context on congruent facial reactions. However, such a mechanism could not be observed for incongruent reactions. Instead, it was found that affective processes in the form of emotional reactions are involved in incongruent facial reactions. Additionally, the involvement of a third class of processes, namely strategic processes, was observed. Experiment 4 aimed at investigating whether a change in the strength of perception can explain the contextual modulation of facial reactions to facial expressions. According to motor process models, the strength of perception is directly related to the strength of the spread of activation from perception to the execution of an action, and thereby to the strength of the resulting mimicry behavior. The results suggest that motor mechanisms were involved in the modulation of congruent facial reactions by attitudes. Such an involvement of motor mechanisms could, however, not be observed for the modulation of incongruent reactions. In Experiment 5, the investigation of neuronal correlates was extended to the involved brain areas via fMRI.
The brain areas proposed as motor areas were prominent parts of the mirror neuron system (MNS). The regions of interest representing affective processing were the amygdala, the insula, and the striatum. It could be shown that changes in the activity of parts of the MNS are related to the modulation of congruent facial reactions. Moreover, the results revealed the involvement of affective processes in the modulation of incongruent facial reactions. In sum, these results lead to a revised working model on the mechanisms underlying the modulation of facial reactions to emotional facial expressions. The results of the five experiments provide strong support for the involvement of motor mechanisms in congruent facial reactions. No evidence was found for the involvement of motor mechanisms in the occurrence or modulation of incongruent facial reactions. Furthermore, no evidence was found for the involvement of valence evaluations in the modulation of facial reactions. Instead, emotional reactions were found to be involved in the modulation of mainly incongruent facial reactions.