The representation of touch in space
Final Report Abstract
This project addressed the influence of additional information on tactile spatial perception. One source of such information is prior experience, that is, knowledge about the physical structure of the world. We tested the influence of previously acquired information on tactile localization on the hand by varying the set of possibly stimulated locations across sessions. Participants quickly learned and incorporated this information, demonstrating that recently acquired information influences tactile localization. At the same time, however, participants maintained a bias towards the center of the hand. Such a central bias is expected if participants also incorporated their long-term knowledge about tactile locations on the hand. Indeed, participants' localization responses were best described by a Bayesian model comprising two priors, one over the currently tested region of the hand and one over the center of the hand.

Other sources of information that might support tactile perception are the other senses. To derive the position of a tactile stimulus in space, the brain has to integrate the stimulus's location on the skin with information about body posture, that is, proprioception. Tactile localization is indeed posture-dependent and often declines dramatically when non-canonical postures are assumed. We tested whether human observers are able to integrate somatotopic and proprioceptive information, that is, skin and body-part locations, in an optimal fashion. Although most participants performed with precision indistinguishable from optimal, they showed modality-specific biases. Thus, unlike an ideal observer, humans seem to incorporate not only the reliabilities of somatotopic and proprioceptive information but also modality-specific priors over space when calculating the location of tactile stimuli in external space.

The most reliable source of spatial information is vision. The final part of the project therefore addressed the integration of haptic, that is, dynamic tactile, information with visual information. For many features, visual and haptic signals are integrated optimally if both signals are presented in the same location, that is, if both signals seem to share a common source. We established for the first time that visual and haptic cues to roughness are indeed integrated optimally, and in the majority of participants, visual and haptic cues to slant were integrated according to optimal cue integration as well. However, although participants judged the same objects in both experiments, the degree of integration for one feature could not be predicted from the degree of integration for the other. Thus, humans seem to base their decision about whether two signals belong to the same object on basic information such as location, but not on all available information.

In sum, tactile perception is influenced by many sources of additional information. This information is often incorporated optimally, but it is not always chosen optimally: in some cases, additional information that should be irrelevant is integrated, and in other cases, information that could be useful is neglected.
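To illustrate the two-prior Bayesian model described above, consider a minimal Gaussian sketch (the symbols below are illustrative assumptions; the abstract does not specify the exact likelihood or prior parameterizations that were fitted). The posterior over the stimulus location $x$, given a noisy sensory measurement $m$, combines the sensory likelihood with both priors:

$$
p(x \mid m) \;\propto\; \mathcal{N}(m;\, x, \sigma_s^2)\,\mathcal{N}(x;\, \mu_r, \sigma_r^2)\,\mathcal{N}(x;\, \mu_c, \sigma_c^2),
$$

where $\mu_r$ is the center of the currently tested region and $\mu_c$ the center of the hand. Because a product of Gaussians is itself Gaussian, the resulting estimate is a precision-weighted average,

$$
\hat{x} \;=\; \frac{m/\sigma_s^2 \;+\; \mu_r/\sigma_r^2 \;+\; \mu_c/\sigma_c^2}{1/\sigma_s^2 \;+\; 1/\sigma_r^2 \;+\; 1/\sigma_c^2},
$$

which is pulled both towards the recently learned region and towards the center of the hand, consistent with the reported pattern of biases.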
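The optimality benchmark invoked in the integration studies above is, in its standard textbook form, maximum-likelihood (reliability-weighted) cue combination (a reference formulation, not necessarily the exact analysis used in the project). For two unbiased cues $\hat{x}_1$ and $\hat{x}_2$ with variances $\sigma_1^2$ and $\sigma_2^2$, the optimal combined estimate and its variance are

$$
\hat{x} \;=\; \frac{\hat{x}_1/\sigma_1^2 + \hat{x}_2/\sigma_2^2}{1/\sigma_1^2 + 1/\sigma_2^2},
\qquad
\sigma_{\mathrm{comb}}^2 \;=\; \frac{\sigma_1^2\,\sigma_2^2}{\sigma_1^2 + \sigma_2^2},
$$

so the combined variance is always smaller than that of either cue alone; performing with "precision indistinguishable from optimal" means the empirically measured bimodal variance reached this predicted minimum.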
Publications
- Badde, S., Oh, H., & Landy, M. (2016). Effect of prior knowledge on visual localization of tactile stimulation. Journal of Vision, 16(12), 1190. (See online at https://dx.doi.org/10.1167/16.12.1190)
- Landy, M., Yang, A., & Badde, S. (2016). Integration of somatosensory and proprioceptive sensation for the localization of touch in visual space. Journal of Vision, 16(12), 1191. (See online at https://dx.doi.org/10.1167/16.12.1191)
- Badde, S., & Landy, M. (2017). Effect of prior knowledge on visual localization of tactile stimulation. Poster presented at the 17th International Multisensory Research Forum, Nashville, USA.