TY - JOUR
A1 - Palmisano, Chiara
A1 - Kullmann, Peter
A1 - Hanafi, Ibrahem
A1 - Verrecchia, Marta
A1 - Latoschik, Marc Erich
A1 - Canessa, Andrea
A1 - Fischbach, Martin
A1 - Isaias, Ioannis Ugo
T1 - A fully-immersive virtual reality setup to study gait modulation
JF - Frontiers in Human Neuroscience
N2 - Objective: Gait adaptation to environmental challenges is fundamental for independent and safe community ambulation. The possibility of precisely studying gait modulation using standardized protocols of gait analysis closely resembling everyday life scenarios is still an unmet need. Methods: We have developed a fully-immersive virtual reality (VR) environment where subjects have to adjust their walking pattern to avoid collision with a virtual agent (VA) crossing their gait trajectory. We collected kinematic data of 12 healthy young subjects walking in the real world (RW) and in the VR environment, both with (VR/A+) and without (VR/A-) the VA perturbation. The VR environment closely resembled the RW scenario of the gait laboratory. To ensure standardized presentation of the obstacle, the starting time, speed, and trajectory of the VA were defined using the kinematics of the participant, as detected online during each walking trial. Results: We did not observe kinematic differences between walking in RW and VR/A-, suggesting that our VR environment per se might not induce significant changes in the locomotor pattern. When facing the VA, all subjects consistently reduced stride length and velocity while increasing stride duration. Trunk inclination and mediolateral trajectory deviation also facilitated avoidance of the obstacle. Conclusions: This proof-of-concept study shows that our VR/A+ paradigm effectively induced a timely gait modulation in a standardized, immersive, and realistic scenario. This protocol could be a powerful research tool to study gait modulation and its derangements in relation to aging and clinical conditions.
KW - gait modulation
KW - virtual reality
KW - obstacle avoidance
KW - gait analysis
KW - kinematics
Y1 - 2022
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-267099
SN - 1662-5161
VL - 16
ER -

TY - JOUR
A1 - Zimmerer, Chris
A1 - Fischbach, Martin
A1 - Latoschik, Marc Erich
T1 - Semantic Fusion for Natural Multimodal Interfaces using Concurrent Augmented Transition Networks
JF - Multimodal Technologies and Interaction
N2 - Semantic fusion is a central requirement of many multimodal interfaces. Procedural methods like finite-state transducers and augmented transition networks have proven beneficial for implementing semantic fusion. They are compatible with the rapid development cycles common in user interface development, in contrast to machine-learning approaches, which require time-costly training and optimization. We identify seven fundamental requirements for the implementation of semantic fusion: action derivation, continuous feedback, context-sensitivity, temporal relation support, access to the interaction context, as well as support for chronologically unsorted and probabilistic input. A subsequent analysis reveals, however, that there is currently no solution that fulfills the latter two requirements. As the main contribution of this article, we thus present the Concurrent Cursor concept to compensate for these shortcomings. In addition, we showcase a reference implementation, the Concurrent Augmented Transition Network (cATN), that validates the concept’s feasibility in a series of proof-of-concept demonstrations as well as through a comparative benchmark. The cATN fulfills all identified requirements and fills the gap left by previous solutions. It supports the rapid prototyping of multimodal interfaces by means of five concrete traits: its declarative nature, the recursiveness of the underlying transition network, the network abstraction constructs of its description language, the semantic queries it utilizes, and an abstraction layer for lexical information. Our reference implementation was and is used in various student projects, theses, and master-level courses. It is openly available and demonstrates that non-experts can effectively implement multimodal interfaces, even for non-trivial applications in mixed and virtual reality.
KW - multimodal fusion
KW - multimodal interface
KW - semantic fusion
KW - procedural fusion methods
KW - natural interfaces
KW - human-computer interaction
Y1 - 2018
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-197573
SN - 2414-4088
VL - 2
IS - 4
ER -