Modular neuron-based body estimation: maintaining consistency over different limbs, modalities, and frames of reference

Please always quote using this URN: urn:nbn:de:bvb:20-opus-122253
This paper addresses the question of how the brain maintains a probabilistic body state estimate over time from a modeling perspective. The neural Modular Modality Frame (nMMF) model simulates such a body state estimation process by continuously integrating redundant, multimodal body state information sources. The body state estimate itself is distributed over separate, but bidirectionally interacting modules. nMMF compares the incoming sensory and present body state information across the interacting modules and fuses the information sources accordingly. At the same time, nMMF enforces body state estimation consistency across the modules. nMMF is able to detect conflicting sensory information and to consequently decrease the influence of implausible sensor sources on the fly. In contrast to the previously published Modular Modality Frame (MMF) model, nMMF offers a biologically plausible neural implementation based on distributed, probabilistic population codes. Besides its neural plausibility, the neural encoding has the advantage of enabling (a) additional probabilistic information flow across the separate body state estimation modules and (b) the representation of arbitrary probability distributions of a body state.
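As a rough illustration of the population-code idea mentioned above (this is a generic sketch, not the authors' nMMF implementation; the tuning-curve parameters and helper names are assumptions), a population of neurons with Gaussian tuning curves can represent an arbitrary probability distribution over a body state variable, including multimodal ones:

```python
import numpy as np

# Hypothetical probabilistic population code over a single joint angle.
# Each of 40 neurons has a Gaussian tuning curve; the pattern of firing
# rates across the population carries the encoded distribution.
angles = np.linspace(-np.pi, np.pi, 200)    # discretized state space (rad)
centers = np.linspace(-np.pi, np.pi, 40)    # preferred angles (assumed)
width = 0.3                                 # tuning-curve width (assumed)

def tuning():
    """Gaussian tuning curves, shape (neurons, states)."""
    return np.exp(-0.5 * ((angles[None, :] - centers[:, None]) / width) ** 2)

def encode(p):
    """Population firing rates for a distribution p(angle)."""
    return tuning() @ p                     # each rate ∝ overlap with p

def decode(rates):
    """Recover a normalized distribution from population activity."""
    q = tuning().T @ rates
    return q / q.sum()

# A bimodal distribution (e.g. two conflicting hand-position hypotheses)
# survives the encode/decode round trip, only slightly smoothed.
p = (np.exp(-0.5 * ((angles - 1.0) / 0.2) ** 2)
     + np.exp(-0.5 * ((angles + 1.0) / 0.2) ** 2))
p /= p.sum()
q = decode(encode(p))
```

The point of the sketch is that, unlike a single mean-plus-variance summary, the population activity keeps both modes of the distribution, which is what allows conflicting hypotheses to coexist until further evidence resolves them.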
The results show that the neural estimates can detect and decrease the impact of false sensory information, can propagate conflicting information across modules, and can improve overall estimation accuracy due to additional module interactions. Even bodily illusions, such as the rubber hand illusion, can be simulated with nMMF. We conclude with an outlook on the potential of modeling human data and of invoking goal-directed behavioral control.
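The fusion and conflict-handling behavior described in the abstract can be sketched in its simplest Gaussian form (a minimal illustration under strong simplifying assumptions, not the nMMF model itself; the function names and sensor values are hypothetical): redundant estimates are combined by precision weighting, and a sensor that deviates strongly from the consensus has its effective variance inflated, so its influence shrinks:

```python
import numpy as np

# Hypothetical sketch: precision-weighted fusion of redundant sensors,
# plus down-weighting of an implausible (conflicting) source.
def fuse(means, variances):
    """Bayes-optimal fusion of independent Gaussian estimates."""
    w = 1.0 / np.asarray(variances)         # precisions
    var = 1.0 / w.sum()
    return var * (w * np.asarray(means)).sum(), var

def deweight(means, variances):
    """Inflate the variance of sensors far from the fused estimate."""
    mu, _ = fuse(means, variances)
    z2 = (np.asarray(means) - mu) ** 2 / np.asarray(variances)
    return np.asarray(variances) * (1.0 + z2)  # implausible -> less weight

# Vision and proprioception roughly agree; touch reports a conflict.
means = [0.1, 0.12, 1.5]
variances = [0.01, 0.02, 0.01]
mu_naive, _ = fuse(means, variances)                   # pulled toward 1.5
mu_robust, _ = fuse(means, deweight(means, variances)) # closer to 0.1/0.12
```

One iteration of `deweight` already moves the fused estimate substantially toward the agreeing sensors; iterating the re-weighting, or propagating such conflicts across interacting modules as nMMF does, sharpens this effect further.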

Metadata
Author: Stephan Ehrenfeld, Oliver Herbort, Martin V. Butz
URN: urn:nbn:de:bvb:20-opus-122253
Document Type: Journal article
Faculties: Fakultät für Humanwissenschaften (Philos., Psycho., Erziehungs- u. Gesell.-Wissensch.) / Institut für Psychologie
Language: English
Parent Title (English): Frontiers in Computational Neuroscience
Year of Completion: 2013
Volume: 7
Issue: 148
Source: Frontiers in Computational Neuroscience 7:148. doi:10.3389/fncom.2013.00148
DOI: https://doi.org/10.3389/fncom.2013.00148
Dewey Decimal Classification: 1 Philosophy and psychology / 15 Psychology / 150 Psychology
Tags: conflicting information; fusion; hand; implementation; information; modular body schema; multimodal interaction; multisensory integration; multisensory perception; multisensory processing; perspective; population code; population codes; posterior parietal cortex; probabilistic inference; representation; see; sensor fusion
Release Date: 2016/02/25
Licence: CC BY (Creative Commons License: Attribution)