TY - JOUR
A1 - Lugrin, Jean-Luc
A1 - Latoschik, Marc Erich
A1 - Habel, Michael
A1 - Roth, Daniel
A1 - Seufert, Christian
A1 - Grafe, Silke
T1 - Breaking Bad Behaviors: A New Tool for Learning Classroom Management Using Virtual Reality
JF - Frontiers in ICT
N2 - This article presents an immersive virtual reality (VR) system for training classroom management skills, with a specific focus on learning to manage disruptive student behavior in face-to-face, one-to-many teaching scenarios. The core of the system is a real-time 3D virtual simulation of a classroom populated by twenty-four semi-autonomous virtual students. The system has been designed as a companion tool for classroom management seminars in a syllabus for primary and secondary school teachers. This will allow lecturers to link theory with practice using the medium of VR. The system is therefore designed for two users: a trainee teacher and an instructor supervising the training session. The teacher is immersed in a real-time 3D simulation of a classroom by means of a head-mounted display and headphones. The instructor operates a graphical desktop console, which renders a view of the class and the teacher, whose avatar movements are captured by a markerless tracking system. This console includes a 2D graphics menu with convenient behavior and feedback control mechanisms to provide human-guided training sessions. The system is built using low-cost consumer hardware and software. Its architecture and technical design are described in detail. A first evaluation confirms its conformance to critical usability requirements (i.e., safety and comfort, believability, simplicity, acceptability, extensibility, affordability, and mobility). Our initial results are promising and constitute the necessary first step toward a possible investigation of the efficiency and effectiveness of such a system in terms of learning outcomes and experience.
KW - virtual reality training
KW - immersive classroom management
KW - immersive classroom
KW - virtual agent interaction
KW - student simulation
Y1 - 2016
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-147945
VL - 3
IS - 26
ER -
TY - JOUR
A1 - Halbig, Andreas
A1 - Latoschik, Marc Erich
T1 - A systematic review of physiological measurements, factors, methods, and applications in virtual reality
JF - Frontiers in Virtual Reality
N2 - Measurements of physiological parameters provide an objective, often non-intrusive, and (at least semi-)automatic evaluation and utilization of user behavior. In addition, specific hardware devices of Virtual Reality (VR) often ship with built-in sensors, i.e., eye-tracking and movement sensors. Hence, the combination of physiological measurements and VR applications seems promising. Several approaches have investigated the applicability and benefits of this combination for various fields of application. However, the range of possible application fields, coupled with potentially useful and beneficial physiological parameters, types of sensor, target variables and factors, and analysis approaches and techniques, is manifold. This article provides a systematic overview and an extensive state-of-the-art review of the usage of physiological measurements in VR. We identified 1,119 works that make use of physiological measurements in VR. Within these, we identified 32 approaches that focus on the classification of characteristics of experience common in VR applications. The first part of this review categorizes the 1,119 works by field of application, i.e.,
therapy, training, entertainment, and communication and interaction, as well as by the specific target factors and variables measured by the physiological parameters. An additional category summarizes general VR approaches applicable to all specific fields of application since they target typical VR qualities. In the second part of this review, we analyze the target factors and variables regarding the respective methods used for an automatic analysis and, potentially, classification. For example, we highlight which measurement setups have been proven to be sensitive enough to distinguish different levels of arousal, valence, anxiety, stress, or cognitive workload in the virtual realm. This work may prove useful for all researchers who want to use physiological data in VR and need a good overview of prior approaches, their benefits, and their potential drawbacks.
KW - virtual reality
KW - use cases
KW - sensors
KW - tools
KW - biosignals
KW - psychophysiology
KW - HMD (Head-Mounted Display)
KW - systematic review
Y1 - 2021
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-260503
VL - 2
ER -
TY - JOUR
A1 - Döllinger, Nina
A1 - Wienrich, Carolin
A1 - Latoschik, Marc Erich
T1 - Challenges and opportunities of immersive technologies for mindfulness meditation: a systematic review
JF - Frontiers in Virtual Reality
N2 - Mindfulness is considered an important factor of an individual's subjective well-being. Consequently, Human-Computer Interaction (HCI) has investigated approaches that strengthen mindfulness, i.e., by inventing multimedia technologies to support mindfulness meditation. These approaches often use smartphones, tablets, or consumer-grade desktop systems to allow everyday usage in users' private lives or in the scope of organized therapies. Virtual, Augmented, and Mixed Reality (VR, AR, MR; in short: XR) significantly extend the design space for such approaches. XR covers a wide range of potential sensory stimulation, perceptive and cognitive manipulations, content presentation, interaction, and agency. These facilities are linked to typical XR-specific perceptions that are conceptually closely related to mindfulness research, such as (virtual) presence and (virtual) embodiment. However, a successful exploitation of XR that strengthens mindfulness requires a systematic analysis of the potential interrelation and influencing mechanisms between XR technology, its properties, factors, and phenomena and existing models and theories of the construct of mindfulness. This article reports such a systematic analysis of XR-related research from HCI and life sciences to determine the extent to which existing research frameworks on HCI and mindfulness can be applied to XR technologies, the potential of XR technologies to support mindfulness, and open research gaps. Fifty papers from the ACM Digital Library and the National Institutes of Health's National Library of Medicine (PubMed), with and without empirical efficacy evaluation, were included in our analysis. The results reveal that, at the current time, empirical research on XR-based mindfulness support mainly focuses on therapy and therapeutic outcomes. Furthermore, most of the currently investigated XR-supported mindfulness interactions are limited to vocally guided meditations within nature-inspired virtual environments.
While an analysis of empirical research on those systems did not reveal differences in mindfulness compared to non-mediated mindfulness practices, various design proposals illustrate that XR has the potential to provide interactive and body-based innovations for mindfulness practice. We propose a structured approach for future work to specify and further explore the potential of XR as mindfulness support. The resulting framework provides design guidelines for XR-based mindfulness support based on the elements and psychological mechanisms of XR interactions.
KW - virtual reality
KW - augmented reality
KW - mindfulness
KW - XR
KW - meditation
Y1 - 2021
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-259047
VL - 2
ER -
TY - JOUR
A1 - Unruh, Fabian
A1 - Landeck, Maximilian
A1 - Oberdörfer, Sebastian
A1 - Lugrin, Jean-Luc
A1 - Latoschik, Marc Erich
T1 - The Influence of Avatar Embodiment on Time Perception - Towards VR for Time-Based Therapy
JF - Frontiers in Virtual Reality
N2 - Psychopathological conditions, such as depression or schizophrenia, are often accompanied by a distorted perception of time. People suffering from these conditions often report that the passage of time slows down considerably and that they are “stuck in time.” Virtual Reality (VR) could potentially help to diagnose and maybe treat such mental conditions. However, the conditions under which a VR simulation could correctly diagnose a time perception deviation are still unknown. In this paper, we present an experiment investigating the difference in time experience with and without a virtual body in VR, also known as an avatar. The process of substituting a person’s body with a virtual body is called avatar embodiment. Numerous studies have demonstrated interesting perceptual, emotional, behavioral, and psychological effects caused by avatar embodiment. However, the relations between time perception and avatar embodiment are still unclear. Whether the presence or absence of an avatar already influences time perception is still open to question. Therefore, we conducted a between-subjects study with and without avatar embodiment as well as a real condition (avatar vs. no-avatar vs. real). A group of 105 healthy subjects had to wait for seven and a half minutes in a room without any distractors (e.g., no window, magazine, people, decoration) or time indicators (e.g., clocks, sunlight). The virtual environment replicated the real physical environment. Participants were unaware that they would be asked to estimate their waiting time duration and to describe their experience of the passage of time at a later stage. Our main finding shows that the presence of an avatar leads to a significantly faster perceived passage of time. It therefore seems promising to integrate avatar embodiment into future VR time-based therapy applications, as it could potentially modulate a user’s perception of the passage of time. We also found no significant difference in time perception between the real and the VR conditions (avatar, no-avatar), but further research is needed to better understand this outcome.
KW - virtual reality
KW - time perception
KW - avatar embodiment
KW - immersion
KW - human computer interaction (HCI)
Y1 - 2021
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-259076
VL - 2
ER -
TY - JOUR
A1 - Oberdörfer, Sebastian
A1 - Latoschik, Marc Erich
T1 - Knowledge encoding in game mechanics: transfer-oriented knowledge learning in desktop-3D and VR
JF - International Journal of Computer Games Technology
N2 - Affine Transformations (ATs) are complex and abstract learning content. Encoding the AT knowledge in Game Mechanics (GMs) achieves a repetitive knowledge application and audiovisual demonstration. Playing a serious game providing these GMs leads to motivating and effective knowledge learning. Using immersive Virtual Reality (VR) has the potential to even further increase the serious game’s learning outcome and learning quality. This paper compares the effectiveness and efficiency of desktop-3D and VR with respect to the achieved learning outcome. Also, the present study analyzes the effectiveness of an enhanced audiovisual knowledge encoding and the provision of a debriefing system. The results validate the effectiveness of the knowledge encoding in GMs to achieve knowledge learning. The study also indicates that VR is beneficial for the overall learning quality and that an enhanced audiovisual encoding has only a limited effect on the learning outcome.
KW - games
Y1 - 2019
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-201159
VL - 2019
ER -
TY - JOUR
A1 - Glémarec, Yann
A1 - Lugrin, Jean-Luc
A1 - Bosser, Anne-Gwenn
A1 - Buche, Cédric
A1 - Latoschik, Marc Erich
T1 - Controlling the stage: a high-level control system for virtual audiences in Virtual Reality
JF - Frontiers in Virtual Reality
N2 - This article presents a novel method for controlling a virtual audience system (VAS) in a Virtual Reality (VR) application, called STAGE, which was originally designed for supervised public speaking training in university seminars dedicated to the preparation and delivery of scientific talks. We are interested in creating pedagogical narratives: narratives encompass affective phenomena, and rather than organizing events that change the course of a training scenario, pedagogical plans using our system focus on organizing the affects aroused in the trainees. Efficiently controlling a virtual audience towards a specific training objective while evaluating the speaker’s performance presents a challenge for a seminar instructor: controlling the virtual audience, evaluating the speaker’s performance, and adjusting the audience so that it quickly reacts to the user’s behaviors and interactions place high cognitive and physical demands on the instructor. It is indeed a critical limitation of a number of existing systems that they rely on a Wizard of Oz approach, where the tutor drives the audience in reaction to the user’s performance. We address this problem by integrating into a VAS a high-level control component for tutors, which allows using predefined audience behavior rules, defining custom ones, as well as intervening during run-time for finer control of the unfolding of the pedagogical plan. At its core, this component offers a tool to program, select, modify, and monitor interactive training narratives using a high-level representation.
STAGE offers the following features: i) a high-level API to program pedagogical narratives focusing on a specific public speaking situation and training objectives, ii) an interactive visualization interface, iii) computation and visualization of user metrics, iv) a semi-autonomous virtual audience composed of virtual spectators with automatic reactions to the speaker and surrounding spectators while following the pedagogical plan, and v) the possibility for the instructor to embody a virtual spectator to ask questions or guide the speaker from within the Virtual Environment. We present here the design and implementation of the tutoring system and its integration in STAGE, and discuss its reception by end-users.
KW - virtual reality
KW - virtual agent
KW - behavior perception
KW - public speaking
KW - education
Y1 - 2022
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-284601
SN - 2673-4192
VL - 3
ER -
TY - JOUR
A1 - Hein, Rebecca M.
A1 - Latoschik, Marc Erich
A1 - Wienrich, Carolin
T1 - Inter- and transcultural learning in social virtual reality: a proposal for an inter- and transcultural virtual object database to be used in the implementation, reflection, and evaluation of virtual encounters
JF - Multimodal Technologies and Interaction
N2 - Visual stimuli are frequently used to improve memory, language learning or perception, and understanding of metacognitive processes. However, in virtual reality (VR), there are few systematically and empirically derived databases. This paper proposes the first collection of virtual objects based on empirical evaluation for inter- and transcultural encounters between English- and German-speaking learners. We used explicit and implicit measurement methods to identify cultural associations and the degree of stereotypical perception for each virtual stimulus (n = 293) through two online studies, including native German- and English-speaking participants. The analysis resulted in a final, well-describable database of 128 objects (called InteractionSuitcase). In future applications, the objects can be used as an interaction or conversation asset and as a behavioral measurement tool in social VR applications, especially in the field of foreign language education. For example, participants in virtual encounters can use the objects to describe their culture, or teachers can intuitively assess the participants' stereotyped attitudes.
KW - virtual stimuli
KW - implicit association test
KW - virtual reality
KW - social VR
KW - InteractionSuitcase
Y1 - 2022
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-278974
SN - 2414-4088
VL - 6
IS - 7
ER -
TY - JOUR
A1 - Zimmerer, Chris
A1 - Fischbach, Martin
A1 - Latoschik, Marc Erich
T1 - Semantic Fusion for Natural Multimodal Interfaces using Concurrent Augmented Transition Networks
JF - Multimodal Technologies and Interaction
N2 - Semantic fusion is a central requirement of many multimodal interfaces. Procedural methods like finite-state transducers and augmented transition networks have proven to be beneficial for implementing semantic fusion. They are compliant with rapid development cycles that are common for the development of user interfaces, in contrast to machine-learning approaches that require time-costly training and optimization.
We identify seven fundamental requirements for the implementation of semantic fusion: action derivation, continuous feedback, context-sensitivity, temporal relation support, access to the interaction context, and the support of chronologically unsorted and probabilistic input. A subsequent analysis reveals, however, that there is currently no solution for fulfilling the latter two requirements. As the main contribution of this article, we thus present the Concurrent Cursor concept to compensate for these shortcomings. In addition, we showcase a reference implementation, the Concurrent Augmented Transition Network (cATN), that validates the concept’s feasibility in a series of proof-of-concept demonstrations as well as through a comparative benchmark. The cATN fulfills all identified requirements and fills this gap among previous solutions. It supports the rapid prototyping of multimodal interfaces by means of five concrete traits: its declarative nature, the recursiveness of the underlying transition network, the network abstraction constructs of its description language, the utilized semantic queries, and an abstraction layer for lexical information. Our reference implementation has been and continues to be used in various student projects, theses, and master-level courses. It is openly available and showcases that non-experts can effectively implement multimodal interfaces, even for non-trivial applications in mixed and virtual reality.
KW - multimodal fusion
KW - multimodal interface
KW - semantic fusion
KW - procedural fusion methods
KW - natural interfaces
KW - human-computer interaction
Y1 - 2018
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-197573
SN - 2414-4088
VL - 2
IS - 4
ER -
TY - JOUR
A1 - Oberdörfer, Sebastian
A1 - Birnstiel, Sandra
A1 - Latoschik, Marc Erich
A1 - Grafe, Silke
T1 - Mutual Benefits: Interdisciplinary Education of Pre-Service Teachers and HCI Students in VR/AR Learning Environment Design
JF - Frontiers in Education
N2 - The successful development and classroom integration of Virtual (VR) and Augmented Reality (AR) learning environments require competencies and content knowledge with respect to media didactics and the respective technologies. The paper discusses a pedagogical concept specifically aiming at the interdisciplinary education of pre-service teachers in collaboration with human-computer interaction students. The students’ overarching goal is the interdisciplinary realization and integration of VR/AR learning environments in teaching and learning concepts. To assist this approach, we developed a specific tutorial guiding the developmental process. We evaluate and validate the effectiveness of the overall pedagogical concept by analyzing the change in attitudes regarding 1) the use of VR/AR for educational purposes and in competencies and content knowledge regarding 2) media didactics and 3) technology. Our results indicate a significant improvement in the knowledge of media didactics and technology. We further report on four STEM learning environments that have been developed during the seminar.
KW - interdisciplinary education
KW - virtual reality
KW - augmented reality
KW - serious games
KW - learning environments
KW - teacher education
Y1 - 2021
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-241612
SN - 2504-284X
VL - 6
ER -
TY - JOUR
A1 - Kern, Florian
A1 - Kullmann, Peter
A1 - Ganal, Elisabeth
A1 - Korwisi, Kristof
A1 - Stingl, René
A1 - Niebling, Florian
A1 - Latoschik, Marc Erich
T1 - Off-The-Shelf Stylus: Using XR Devices for Handwriting and Sketching on Physically Aligned Virtual Surfaces
JF - Frontiers in Virtual Reality
N2 - This article introduces the Off-The-Shelf Stylus (OTSS), a framework for 2D interaction (in 3D) as well as for handwriting and sketching with digital pen, ink, and paper on physically aligned virtual surfaces in Virtual, Augmented, and Mixed Reality (VR, AR, MR: XR for short). OTSS supports self-made XR styluses based on consumer-grade six-degrees-of-freedom XR controllers and commercially available styluses. The framework provides separate modules for three basic but vital features: 1) The stylus module provides stylus construction and calibration features. 2) The surface module provides surface calibration and visual feedback features for virtual-physical 2D surface alignment using our so-called 3ViSuAl procedure, and surface interaction features. 3) The evaluation suite provides a comprehensive test bed combining technical measurements for precision, accuracy, and latency with extensive usability evaluations, including handwriting and sketching tasks based on established visuomotor, graphomotor, and handwriting research. The framework’s development is accompanied by an extensive open source reference implementation targeting the Unity game engine and using an Oculus Rift S headset and Oculus Touch controllers. The development compares three low-cost and low-tech options to equip controllers with a tip and includes a web browser-based surface providing support for interacting, handwriting, and sketching. The evaluation of the reference implementation based on the OTSS framework identified an average stylus precision of 0.98 mm (SD = 0.54 mm) and an average surface accuracy of 0.60 mm (SD = 0.32 mm) in a seated VR environment. The time for displaying the stylus movement as digital ink on the web browser surface in VR was 79.40 ms on average (SD = 23.26 ms), including the physical controller’s motion-to-photon latency visualized by its virtual representation (M = 42.57 ms, SD = 15.70 ms). The usability evaluation (N = 10) revealed a low task load, high usability, and high user experience. Participants successfully reproduced given shapes and created legible handwriting, indicating that the OTSS and its reference implementation are ready for everyday use. We provide source code access to our implementation, including stylus and surface calibration and surface interaction features, making it easy to reuse, extend, adapt, and/or replicate previous results (https://go.uniwue.de/hci-otss).
KW - virtual reality
KW - augmented reality
KW - handwriting
KW - sketching
KW - stylus
KW - user interaction
KW - usability evaluation
KW - passive haptic feedback
Y1 - 2021
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-260219
VL - 2
ER -