Refine
Has Fulltext
- yes (47)
Year of publication
Document Type
- Journal article (37)
- Doctoral Thesis (7)
- Bachelor Thesis (1)
- Conference Proceeding (1)
- Preprint (1)
Language
- English (47)
Keywords
- virtual reality (47)
Institute
- Institut für Informatik (19)
- Institut für Psychologie (19)
- Institut Mensch - Computer - Medien (8)
- Graduate School of Life Sciences (5)
- Institut für Allgemeinmedizin (2)
- Institut für Psychologie (bis Sept. 2007) (1)
- Institut für Pädagogik (1)
- Institut für Sportwissenschaft (1)
- Klinik und Poliklinik für Anästhesiologie (ab 2004) (1)
- Medizinische Fakultät (1)
Sonstige beteiligte Institutionen
Obesity is a serious disease that can affect both physical and psychological well-being. Due to weight stigmatization, many affected individuals suffer from body image disturbances whereby they perceive their body in a distorted way, evaluate it negatively, or neglect it. Beyond established interventions such as mirror exposure, recent advancements aim to complement body image treatments through the embodiment of visually altered virtual bodies in virtual reality (VR). We present a high-fidelity prototype of an advanced VR system that allows users to embody a rapidly generated, personalized, photorealistic avatar and to realistically modulate its body weight in real time within a carefully designed virtual environment. In a formative multi-method approach, a total of 12 participants rated the general user experience (UX) of our system during body scan and VR experience using semi-structured qualitative interviews and multiple quantitative UX measures. Using body weight modification tasks, we further compared three different interaction methods for real-time body weight modification and measured our system’s impact on the body-image-relevant measures of body awareness and body weight perception. The feedback received demonstrated an already solid UX of our overall system and provided constructive input for further improvement; from it, we derived a set of design guidelines to guide future development and evaluation of systems supporting body image interventions.
Mindfulness is considered an important factor of an individual's subjective well-being. Consequently, Human-Computer Interaction (HCI) has investigated approaches that strengthen mindfulness, e.g., by developing multimedia technologies to support mindfulness meditation. These approaches often use smartphones, tablets, or consumer-grade desktop systems to allow everyday usage in users' private lives or in the scope of organized therapies. Virtual, Augmented, and Mixed Reality (VR, AR, MR; in short: XR) significantly extend the design space for such approaches. XR covers a wide range of potential sensory stimulation, perceptive and cognitive manipulations, content presentation, interaction, and agency. These facilities are linked to typical XR-specific perceptions that are conceptually closely related to mindfulness research, such as (virtual) presence and (virtual) embodiment. However, a successful exploitation of XR that strengthens mindfulness requires a systematic analysis of the potential interrelations and influencing mechanisms between XR technology (its properties, factors, and phenomena) and existing models and theories of the construct of mindfulness. This article reports such a systematic analysis of XR-related research from HCI and the life sciences to determine the extent to which existing research frameworks on HCI and mindfulness can be applied to XR technologies, the potential of XR technologies to support mindfulness, and open research gaps. Fifty papers from the ACM Digital Library and the National Institutes of Health's National Library of Medicine (PubMed), with and without empirical efficacy evaluation, were included in our analysis. The results reveal that at the current time, empirical research on XR-based mindfulness support mainly focuses on therapy and therapeutic outcomes. Furthermore, most of the currently investigated XR-supported mindfulness interactions are limited to vocally guided meditations within nature-inspired virtual environments.
While an analysis of empirical research on those systems did not reveal differences in mindfulness compared to non-mediated mindfulness practices, various design proposals illustrate that XR has the potential to provide interactive and body-based innovations for mindfulness practice. We propose a structured approach for future work to specify and further explore the potential of XR for mindfulness support. The resulting framework provides design guidelines for XR-based mindfulness support based on the elements and psychological mechanisms of XR interactions.
Virtual reality (VR) has made its way into mainstream psychological research in the last two decades. This technology, with its unique ability to simulate complex, real situations and contexts, offers researchers unprecedented opportunities to investigate human behavior in well-controlled designs in the laboratory. One important application of VR is the investigation of pathological processes in mental disorders, especially anxiety disorders. Research on the processes underlying threat perception, fear, and exposure therapy has shed light on more general aspects of the relation between perception and emotion. Being by its nature a simulation of reality, VR strongly relies on the adequate selection of specific perceptual cues to activate emotions. Emotional experiences in turn are related to presence, another important concept in VR, which describes the user's sense of being in a VR environment. This paper summarizes current research into the perception of fear cues, emotion, and presence, aiming to identify the most relevant aspects of emotional experience in VR and their mutual relations. A special focus lies on a series of recent experiments designed to test the relative contribution of perception and conceptual information to fear in VR. This strand of research capitalizes on the dissociation between perception (bottom-up input) and conceptual information (top-down input) that is possible in VR. Further, we review the factors that have so far been recognized to influence presence, with emotions (e.g., fear) being the most relevant in the context of clinical psychology. Recent research has highlighted the mutual influence of presence and fear in VR, but has also traced the limits of our current understanding of this relationship. In this paper, the crucial role of perception in eliciting emotional reactions is highlighted, and the role of arousal as a basic dimension of emotional experience is discussed.
An interoceptive attribution model of presence is suggested as a first step toward an integrative framework for emotion research in VR. Gaps in the current literature and future directions are outlined.
This thesis deals with the first part of a larger project that follows the ultimate goal of implementing a software tool that creates a Mission Control Room in Virtual Reality. The software is to be used for the operation of spacecraft and is specifically developed for the unique real-time requirements of unmanned satellite missions. From launch, throughout the whole mission, up to the recovery or disposal of the satellite, all systems need to be monitored and controlled continuously to ensure the mission’s success. Mission Operation is an essential part of every space mission and has been undertaken for decades. Recent technological advancements in the realm of immersive technologies pave the way for innovative methods of operating spacecraft. Virtual Reality has the capability to resolve the physical constraints set by traditional Mission Control Rooms and thereby delivers novel opportunities. The paper highlights underlying theoretical aspects of Virtual Reality, Mission Control, and IP Communication. However, the focus lies upon the practical part of this thesis, which revolves around the first steps of the implementation of the virtual Mission Control Room in the Unity Game Engine. Overall, this paper serves as a demonstration of Virtual Reality technology and shows its possibilities with respect to the operation of spacecraft.
With the rise of immersive media, advertisers have started to use 360° commercials to engage and persuade consumers. Two experiments were conducted to address research gaps and to validate the positive impact of 360° commercials in realistic settings. The first study (N = 62) compared the effects of 360° commercials using either a mobile cardboard head-mounted display (HMD) or a laptop. This experiment was conducted in the participants’ living rooms and incorporated individual feelings of cybersickness as a moderator. The participants who experienced the 360° commercial with the HMD reported higher spatial presence and product evaluation, but their purchase intentions were only increased when their reported cybersickness was low. The second experiment (N = 197) was conducted online and analyzed the impact of 360° commercials that were experienced with mobile (smartphone/tablet) or static (laptop/desktop) devices instead of HMDs. The positive effects of omnidirectional videos were stronger when participants used mobile devices.
Interpreting blood gas analysis results can be challenging for the clinician, especially in stressful situations under time pressure. To foster fast and correct interpretation of blood gas results, we developed Visual Blood. This computer-based, multicentre, noninferiority study compared Visual Blood and conventional arterial blood gas (ABG) printouts. We presented six scenarios to anaesthesiologists, once with Visual Blood and once with the conventional ABG printout. The primary outcome was ABG parameter perception. The secondary outcomes included correct clinical diagnoses, perceived diagnostic confidence, and perceived workload. To analyse the results, we used mixed models and matched odds ratios. Analysing 300 within-subject cases, we showed noninferiority of Visual Blood compared to ABG printouts concerning the rate of correctly perceived ABG parameters (rate ratio, 0.96; 95% CI, 0.92 to 1.00; p = 0.06). Additionally, the study revealed two times higher odds of making the correct clinical diagnosis using Visual Blood (OR, 2.16; 95% CI, 1.42 to 3.29; p < 0.001) than using ABG printouts. There was no evidence for a difference in diagnostic confidence (OR, 0.84; 95% CI, 0.58 to 1.21; p = 0.34) and only weak evidence for a difference in perceived workload (coefficient, 2.44; 95% CI, −0.09 to 4.98; p = 0.06). This study showed that participants did not perceive the ABG parameters better, but using Visual Blood resulted in more correct clinical diagnoses than using conventional ABG printouts. This suggests that Visual Blood allows for a higher level of situation awareness beyond the perception of individual parameters. However, the study also highlighted the limitations of today’s virtual reality headsets and Visual Blood.
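The matched odds ratio reported above comes from a paired (within-subject) design, where each clinician assessed every scenario under both conditions. A minimal sketch of the conditional (matched-pairs) estimator, which uses only the discordant pairs; the counts below are hypothetical illustrations, not the study's data:

```python
# Matched (conditional) odds ratio for paired binary outcomes,
# as used when each participant is tested under both conditions.
# Discordant-pair counts here are hypothetical, not study data.

def matched_odds_ratio(n_new_only: int, n_old_only: int) -> float:
    """Conditional odds ratio from discordant pairs.

    n_new_only: pairs with a correct diagnosis only under the new display
    n_old_only: pairs with a correct diagnosis only under the old printout
    Concordant pairs (both correct or both wrong) do not enter the estimate.
    """
    if n_old_only == 0:
        raise ValueError("odds ratio undefined without discordant pairs")
    return n_new_only / n_old_only

# Hypothetical discordant-pair counts chosen to reproduce an OR of 2.16:
print(matched_odds_ratio(54, 25))  # 2.16
```

In practice, the confidence interval and p value for such paired data would come from a conditional logistic or mixed model rather than this raw ratio alone.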
Attention-Deficit/Hyperactivity Disorder (ADHD) is characterized by symptoms of inattentiveness and hyperactivity/impulsivity. In addition, increasing evidence points to ADHD patients showing emotional dysfunctions and concomitant problems in social life. However, systematic research on emotional dysfunctions in ADHD is still rare, and to date most studies lack a conceptual differentiation between emotion processing and emotion regulation. The aim of this thesis was to systematically investigate emotion processing and emotion regulation in adult ADHD in a virtual reality paradigm implementing social interaction. Emotional reactions were assessed on experiential, physiological, and behavioral levels.
Experiment 1 was conducted to develop a virtual penalty kicking paradigm involving social feedback and to test it in a healthy sample. This paradigm was then to be applied to ADHD patients. Pleasant and unpleasant trials in this paradigm consisted of hits or misses, respectively, and subsequent feedback from a virtual coach. In neutral trials, participants were teleported to different spots of the virtual stadium. Results indicated increased positive affectivity (higher valence and arousal ratings, higher zygomaticus activations, and higher expression rates of positive emotional behavior) in response to pleasant compared to neutral trials. Reactions to unpleasant trials were contradictory, indicating increased levels of both positive and negative affectivity compared to neutral trials. Unpleasant vs. neutral trials revealed lower valence ratings, higher arousal ratings, higher zygomaticus activations, slightly lower corrugator activations, and higher expression rates of both positive and negative emotional behavior. The intensity of emotional reactions correlated with experienced presence in virtual reality.
To better understand the impact of hits or misses per se vs. hits or misses with coach feedback on healthy participants’ emotional reactions, only 50% of all shots were followed by coach feedback in experiment 2. Neutral trials consisted of shots over the free soccer field, which were followed by coach feedback in 50% of all trials. Shots with feedback evoked more extreme valence and arousal ratings, higher zygomaticus activations, lower corrugator activations, and higher skin conductance responses than shots alone across emotional conditions. Again, the results speak for the induction of positive emotions in pleasant trials, whereas the induction of negative emotions in unpleasant trials seems ambiguous. Technical improvements to the virtual reality were reflected in higher presence ratings than in experiment 1.
Experiment 3 investigated emotional reactions of adult ADHD patients and healthy controls after emotion processing and response-focused emotion regulation. Participants successively went through an ostensible online ball-tossing game (cyber ball) inducing negative emotions, and an adapted version of the virtual penalty kicking game. Throughout cyber ball, participants were included or ostracized by two other players in different experimental blocks. Participants were instructed to explicitly show, not regulate, or hide their emotions in different experimental blocks. Results provided some evidence for deficient processing of positive emotions in ADHD. Patients reported slightly lower positive affect than controls during cyber ball, gave lower valence ratings than controls in response to pleasant penalty kicking trials, and showed lower zygomaticus activations than controls, especially during penalty kicking. Patients in comparison with controls showed slightly increased processing of unpleasant events during cyber ball (higher ratings of negative affect, especially in response to ostracism), but not during penalty kicking. Patients showed lower baseline skin conductance levels than controls, and impaired skin conductance modulations. Compared to controls, patients showed a slight over-expression of positive as well as negative emotional behavior. Emotion regulation analyses revealed no major difficulties of ADHD patients vs. controls in altering their emotional reactions through deliberate response modulation. Moreover, patients reported habitually applying adaptive emotion regulation strategies even more frequently than controls. The analyses of genetic high-risk vs. low-risk groups for ADHD across the whole sample revealed results similar to the patients vs. controls analyses for zygomaticus modulations during emotion processing, and for modulations of emotional reactions due to emotion regulation.
To sum up, the virtual penalty kicking paradigm proved successful for the induction of positive, but not negative, emotions. The importance of presence in virtual reality for the intensity of induced emotions could be replicated. ADHD patients showed impaired processing primarily of positive emotions. Aberrations in negative emotional responding were less clear and need further investigation. Results point to adults with ADHD, in comparison to healthy controls, suffering from baseline deficits in autonomic arousal and deficits in arousal modulation. Deficits in the deliberate application of response-focused emotion regulation could not be found in ADHD.