Project Details
Multisensory integration, sensory recalibration, and the reduction of cross-modal perceptual discrepancies
Applicant
Professor Dr. Christoph Kayser
Subject Area
General, Cognitive and Mathematical Psychology
Term
from 2019 to 2023
Project identifier
Deutsche Forschungsgemeinschaft (DFG) - Project number 425795980
We experience the environment by combining information across our different senses. Often, these provide at least somewhat discrepant evidence, and our brain is faced with the challenge of merging this information into a coherent and unified percept. For example, when visual and auditory stimuli are presented at different locations, or when a hand movement ends at a location different from the one perceived visually, spatial biases influence our percepts. That is, the perceived location of an event deviates from the physical reality, or from what would be perceived if only a single stimulus had been present. Such cross-modal biases can arise from the discrepant multisensory information available at any given moment, a process known as multisensory integration. However, such biases can also arise from the adaptation to discrepancies in the multisensory environment that persist for minutes or longer, a process known as recalibration. While in typical experimental settings integration and recalibration emerge on distinct time scales, both are central mechanisms for reducing apparent discrepancies in the available sensory information. We here propose to investigate the perceptual mechanisms underlying multisensory integration and recalibration, and how these interact to collectively shape perception. We focus on two paradigms: one relying on passive audio-visual perception, the other capitalizing on the active interaction with the environment via hand movements under visual feedback (visuo-motor task), in order to distinguish results that generalize across modalities and perceptual tasks from paradigm-specific results.
We focus on three key hypotheses derived from previous work, each to be tested in a separate series of experiments: i) both processes are modulated in parallel by the observer’s belief that the apparently discrepant information actually arises from a common object; ii) both processes are affected in parallel by the relative reliabilities of the two modalities; and iii) integration and recalibration both work to ensure a coherent percept, but each by removing sensory discrepancies on distinct time scales. Knowledge of the mechanisms that guide the interplay between multisensory integration and recalibration is paramount to understanding how the brain flexibly adapts to the often volatile sensory evidence in real-life environments. This understanding can in turn be used to uncover the underlying neural mechanisms in the brain, and how perceptual challenges affect individuals with sensory disabilities, specific disorders, or during cognitive decline.
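Hypothesis ii) builds on the standard account of reliability-weighted cue combination, in which each modality's estimate is weighted by its reliability (inverse variance). The following minimal sketch illustrates this general model; the function name and all numerical values are illustrative assumptions, not taken from the project itself.

```python
def integrate(loc_vis, sigma_vis, loc_aud, sigma_aud):
    """Fuse a visual and an auditory location estimate.

    Reliability r = 1 / sigma^2; the fused estimate is the
    reliability-weighted average of the two cues, and its variance
    is smaller than either single-cue variance.
    """
    r_vis = 1.0 / sigma_vis ** 2
    r_aud = 1.0 / sigma_aud ** 2
    fused = (r_vis * loc_vis + r_aud * loc_aud) / (r_vis + r_aud)
    fused_sigma = (1.0 / (r_vis + r_aud)) ** 0.5
    return fused, fused_sigma

# A reliable visual cue (sigma = 1 deg) at 0 deg paired with a noisier
# auditory cue (sigma = 3 deg) at 10 deg: the fused percept is strongly
# biased toward the visual location.
loc, sigma = integrate(0.0, 1.0, 10.0, 3.0)
print(round(loc, 2), round(sigma, 2))  # fused location 1.0 deg
```

On this account, the cross-modal spatial biases described above fall out naturally: the less reliable modality is pulled toward the more reliable one in proportion to their relative reliabilities.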
DFG Programme
Research Grants
Co-Investigator
Professor Dr. Herbert Heuer