Virtual reality (VR) has made its way into mainstream psychological research in the last two decades. This technology, with its unique ability to simulate complex, real situations and contexts, offers researchers unprecedented opportunities to investigate human behavior in well-controlled designs in the laboratory. One important application of VR is the investigation of pathological processes in mental disorders, especially anxiety disorders. Research on the processes underlying threat perception, fear, and exposure therapy has shed light on more general aspects of the relation between perception and emotion. Being by its nature virtual, i.e., a simulation of reality, VR strongly relies on the adequate selection of specific perceptual cues to activate emotions. Emotional experiences in turn are related to presence, another important concept in VR, which describes the user's sense of being in a VR environment. This paper summarizes current research into perception of fear cues, emotion, and presence, aiming at the identification of the most relevant aspects of emotional experience in VR and their mutual relations. A special focus lies on a series of recent experiments designed to test the relative contribution of perception and conceptual information to fear in VR. This strand of research capitalizes on the dissociation between perception (bottom-up input) and conceptual information (top-down input) that is possible in VR. Further, we review the factors that have so far been recognized to influence presence, with emotions (e.g., fear) being the most relevant in the context of clinical psychology. Recent research has highlighted the mutual influence of presence and fear in VR, but has also traced the limits of our current understanding of this relationship. In this paper, the crucial role of perception in eliciting emotional reactions is highlighted, and the role of arousal as a basic dimension of emotional experience is discussed.
An interoceptive attribution model of presence is suggested as a first step toward an integrative framework for emotion research in VR. Gaps in the current literature and future directions are outlined.
Alerting signals often serve to reduce temporal uncertainty by predicting the time of stimulus onset. The resulting response time benefits have often been explained by facilitated translation of stimulus codes into response codes on the basis of established stimulus-response (S-R) links. In paradigms of masked S-R priming, alerting signals also modulate response activation processes triggered by subliminally presented prime stimuli. In the present study, we tested whether facilitation of visuo-motor translation processes due to alerting signals critically depends on established S-R links. Alerting signals resulted in significantly enhanced masked priming effects for masked prime stimuli that included and that did not include established S-R links (i.e., target vs. novel primes). Yet, the alerting-priming interaction was more pronounced for target than for novel primes. These results suggest that effects of alerting signals on masked priming are especially evident when S-R links between prime and target exist. At the same time, the alerting-priming interaction observed for novel primes suggests that alerting signals also facilitate stimulus-response translation processes when masked prime stimuli provide action-trigger conditions in terms of programmed S-R links.
Embodiment (i.e., the involvement of a bodily representation) is thought to be relevant in emotional experiences. Virtual reality (VR) is a capable means of activating phobic fear in patients. The representation of the patient’s body (e.g., the right hand) in VR enhances immersion and increases presence, but its effect on phobic fear is still unknown. We analyzed the influence of the presentation of the participant’s hand in VR on presence and fear responses in 32 women with spider phobia and 32 matched controls. Participants sat in front of a table with an acrylic glass container within reaching distance. During the experiment this setup was concealed by a head-mounted display (HMD). The VR scenario presented via HMD showed the same setup, i.e., a table with an acrylic glass container. Participants were randomly assigned to one of two experimental groups. In one group, fear responses were triggered by fear-relevant visual input in VR (virtual spider in the virtual acrylic glass container), while information about a real but unseen neutral control animal (living snake in the acrylic glass container) was given. The second group received fear-relevant information about the real but unseen situation (living spider in the acrylic glass container), but visual input in VR was kept neutral (virtual snake in the virtual acrylic glass container). Participants were instructed to touch the acrylic glass container with their right hand in 20 consecutive trials. Visibility of the hand was varied randomly in a within-subjects design. We found for all participants that visibility of the participant’s hand increased presence independently of the fear trigger. However, in patients, the influence of the virtual hand on fear depended on the fear trigger. When fear was triggered perceptually, i.e., by a virtual spider, the virtual hand increased fear. When fear was triggered by information about a real spider, the virtual hand had no effect on fear.
Our results shed light on the significance of different fear triggers (visual, conceptual) in interaction with body representations.
Face processing can be explored using electrophysiological methods. Research with event-related potentials has demonstrated the so-called face inversion effect, in which the N170 component is enhanced in amplitude and latency to inverted, compared to upright, faces. The present study explored the extent to which repetitive lower-level visual cortical engagement, reflected in flicker steady-state visual evoked potentials (ssVEPs), shows similar amplitude enhancement to face inversion. We also asked if inversion-related ssVEP modulation would be dependent on the stimulation rate at which upright and inverted faces were flickered. To this end, multiple tagging frequencies were used (5, 10, 15, and 20 Hz) across two studies (n=21, n=18). Results showed that amplitude enhancement of the ssVEP for inverted faces was found solely at higher stimulation frequencies (15 and 20 Hz). By contrast, lower frequency ssVEPs did not show this inversion effect. These findings suggest that stimulation frequency affects the sensitivity of ssVEPs to face inversion.
Several emotion theorists suggest that valenced stimuli automatically trigger motivational orientations and thereby facilitate corresponding behavior. Positive stimuli were thought to activate approach motivational circuits, which in turn primed approach-related behavioral tendencies, whereas negative stimuli were supposed to activate avoidance motivational circuits, so that avoidance-related behavioral tendencies were primed (motivational orientation account). However, recent research suggests that typically observed affective stimulus-response compatibility phenomena might be entirely explained in terms of theories accounting for mechanisms of general action control, instead of assuming motivational orientations to mediate the effects (evaluative coding account). In what follows, we explore to what extent this notion is applicable. We present literature suggesting that evaluative coding mechanisms indeed influence a wide variety of affective stimulus-response compatibility phenomena. However, the evaluative coding account does not seem to be sufficient to explain affective S-R compatibility effects. Instead, several studies provide clear evidence in favor of the motivational orientation account, which seems to operate independently of evaluative coding mechanisms. Implications for theoretical developments and future research designs are discussed.
Changes in body perception often arise when observers are confronted with related yet discrepant multisensory signals. Some of these effects are interpreted as outcomes of sensory integration of various signals, whereas related biases are ascribed to learning-dependent recalibration of the coding of individual signals. The present study explored whether the same sensorimotor experience entails changes in body perception that are indicative of multisensory integration as well as those that indicate recalibration. Participants enclosed visual objects by a pair of visual cursors controlled by finger movements. Then either they judged their perceived finger posture (indicating multisensory integration) or they produced a certain finger posture (indicating recalibration). An experimental variation of the size of the visual object resulted in systematic and opposite biases of the perceived and produced finger distances. This pattern of results is consistent with the assumption that multisensory integration and recalibration had a common origin in the task we used.
It has been argued that several reported non-visual influences on perception cannot be truly perceptual. If they were, they should affect the perception of target objects and reference objects used to express perceptual judgments, and thus cancel each other out. This reasoning presumes that non-visual manipulations impact target objects and comparison objects equally. In the present study, we show that equalizing a body-related manipulation between target objects and reference objects essentially abolishes the impact of that manipulation, as it should if that manipulation actually altered perception. Moreover, the manipulation has an impact on judgments when applied only to the target object but not to the reference object, and that impact reverses when applied only to the reference object but not to the target object. A perceptual explanation predicts this reversal, whereas explanations in terms of post-perceptual response biases or demand effects do not. Altogether, these results suggest that body-related influences on perception cannot as a whole be attributed to extra-perceptual factors.
The present study explored the origin of perceptual changes repeatedly observed in the context of actions. In Experiment 1, participants tried to hit a circular target with a stylus movement under restricted feedback conditions. We measured the perception of target size during action planning and observed larger estimates for larger movement distances. In Experiment 2, we then tested the hypothesis that this action-specific influence on perception is due to changes in the allocation of spatial attention. For this purpose, we replaced the hitting task by conditions of focused and distributed attention and measured the perception of the former target stimulus. The results revealed changes in the perceived stimulus size very similar to those observed in Experiment 1. These results indicate that the effects of action on perception are rooted in changes of spatial attention.