TY - JOUR
A1 - Klaffehn, Annika L.
A1 - Sellmann, Florian B.
A1 - Kirsch, Wladimir
A1 - Kunde, Wilfried
A1 - Pfister, Roland
T1 - Temporal binding as multisensory integration: Manipulating perceptual certainty of actions and their effects
JF - Attention, Perception & Psychophysics
N2 - It has been proposed that statistical integration of multisensory cues may be a suitable framework to explain temporal binding, that is, the finding that causally related events such as an action and its effect are perceived to be shifted towards each other in time. A multisensory approach to temporal binding construes actions and effects as individual sensory signals, which are each perceived with a specific temporal precision. When they are integrated into one multimodal event, like an action-effect chain, the extent to which they affect this event's perception depends on their relative reliability. We test whether this assumption holds true in a temporal binding task by manipulating certainty of actions and effects. Two experiments suggest that a relatively uncertain sensory signal in such action-effect sequences is shifted more towards its counterpart than a relatively certain one. This was especially pronounced for temporal binding of the action towards its effect but could also be shown for effect binding. Other conceptual approaches to temporal binding cannot easily explain these results, and the study therefore adds to the growing body of evidence endorsing a multisensory approach to temporal binding.
KW - temporal processing
KW - perception and action
KW - multisensory processing
Y1 - 2021
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-273195
SN - 1943-393X
VL - 83
IS - 8
ER - 
TY - JOUR
A1 - Kirsch, Wladimir
T1 - On the relevance of task instructions for the influence of action on perception
JF - Attention, Perception & Psychophysics
N2 - The present study explored how task instructions mediate the impact of action on perception.
Participants saw a target object while performing finger movements. Then either the size of the target or the size of the adopted finger postures was judged. The target judgment was attracted towards the adopted finger posture, indicating sensory integration of body-related and visual signals. The magnitude of integration, however, depended on how the task was initially described: it was substantially larger when the experimental instructions indicated that the finger movements and the target object related to the same event than when they suggested that the two were unrelated. This outcome highlights the role of causal inference processes in the emergence of action-specific influences on perception.
KW - perception and action
KW - multisensory processing
KW - finger movements
Y1 - 2021
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-273185
SN - 1943-393X
VL - 83
IS - 6
ER - 
TY - JOUR
A1 - Ehrenfeld, Stephan
A1 - Herbort, Oliver
A1 - Butz, Martin V.
T1 - Modular neuron-based body estimation: maintaining consistency over different limbs, modalities, and frames of reference
JF - Frontiers in Computational Neuroscience
N2 - This paper addresses the question of how the brain maintains a probabilistic body state estimate over time from a modeling perspective. The neural Modular Modality Frame (nMMF) model simulates such a body state estimation process by continuously integrating redundant, multimodal body state information sources. The body state estimate itself is distributed over separate but bidirectionally interacting modules. nMMF compares the incoming sensory and present body state information across the interacting modules and fuses the information sources accordingly. At the same time, nMMF enforces body state estimation consistency across the modules. nMMF is able to detect conflicting sensory information and to consequently decrease the influence of implausible sensor sources on the fly.
In contrast to the previously published Modular Modality Frame (MMF) model, nMMF offers a biologically plausible neural implementation based on distributed, probabilistic population codes. Besides its neural plausibility, the neural encoding has the advantage of enabling (a) additional probabilistic information flow across the separate body state estimation modules and (b) the representation of arbitrary probability distributions of a body state. The results show that the neural estimates can detect and decrease the impact of false sensory information, can propagate conflicting information across modules, and can improve overall estimation accuracy due to additional module interactions. Even bodily illusions, such as the rubber hand illusion, can be simulated with nMMF. We conclude with an outlook on the potential of modeling human data and of invoking goal-directed behavioral control.
KW - information
KW - posterior parietal cortex
KW - hand
KW - population code
KW - conflicting information
KW - multimodal interaction
KW - probabilistic inference
KW - modular body schema
KW - sensor fusion
KW - multisensory perception
KW - fusion
KW - representation
KW - multisensory processing
KW - see
KW - implementation
KW - perspective
KW - multisensory integration
KW - population codes
Y1 - 2013
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-122253
VL - 7
IS - 148
ER - 
TY - JOUR
A1 - Cao, Liyu
A1 - Steinborn, Michael
A1 - Kunde, Wilfried
A1 - Haendel, Barbara
T1 - Action force modulates action binding: evidence for a multisensory information integration explanation
JF - Experimental Brain Research
N2 - Action binding refers to the observation that the perceived time of an action (e.g., a keypress) is shifted towards the distal sensory feedback (usually a sound) triggered by that action. Surprisingly, the role of somatosensory feedback for this phenomenon has been largely ignored.
We fill this gap by showing that somatosensory feedback, indexed by keypress peak force, is functional in judging keypress time. Specifically, the strength of somatosensory feedback is positively correlated with reported keypress time when the keypress is not associated with auditory feedback and negatively correlated when the keypress triggers auditory feedback. This result is consistent with the view that the reported keypress time is shaped by sensory information from different modalities. Moreover, individual differences in action binding can be explained by the weighting of sensory information between somatosensory and auditory feedback. At the group level, increasing the strength of somatosensory feedback can decrease action binding to a level that is no longer statistically detectable. Therefore, a multisensory information integration account (between somatosensory and auditory inputs) explains action binding at both the group and the individual level.
KW - action binding
KW - force
KW - somatosensory feedback
KW - multisensory processing
Y1 - 2020
U6 - http://nbn-resolving.de/urn/resolver.pl?urn:nbn:de:bvb:20-opus-232534
SN - 0014-4819
VL - 238
ER - 