Eye-Hand Coordination for the Control of Hand Movements
Summary of Project Results
Eye and hand movements are highly coordinated during the various goal-directed manual actions performed in daily and occupational life. Of the two, the hands are the only effector that can interact with extra-personal space, and eye movements are therefore controlled to facilitate effective hand movements. In a series of experiments, we showed that gaze control is tied to specific processes involved in controlling various manual actions (discrete, sequential, transformed, and continuous movements). A commonly observed gaze pattern in discrete and sequential reaches is gaze anchoring to a target (the task goal or a sub-goal) during the reach. This gaze behavior is regulated depending on multimodal (visual, proprioceptive, auditory) assessments of reach completion and on planning of the accuracy and amplitude of the subsequent reach. Surprisingly, it is largely unaffected by the microstructure of the reach (its ballistic and corrective sub-movements).

The spatial and temporal correlation between eye and hand at important landmarks of manual actions is critical and heavily regulated. Such landmarks are the start and the goal (or sub-goal) of discrete and sequential movements, and the low-velocity points of continuous hand movements. The spatiotemporal correlation between gaze locations and hand movements allows visual and proprioceptive signals to be aligned for verifying the current progress of the action and for planning its next phase, toward the successful completion of complex actions.

Gaze anchoring to a target is established through learning of a novel visuomotor transformation with on-line visual feedback. Gaze is anchored to the visual feedback during unpredictable reaches in the early learning phase, but to the target during predictable reaches in the late learning phase. This adaptive change of gaze location reflects a functional change of gaze control from reactive control, which explores the relation between hand and feedback, to predictive control, which guides the hand to the task goal. The adaptive pattern is robust and is used across various types of control of hand movements during learning (implicit adaptation, explicit strategic adjustment, feedback-based control, and joint-level control). The speed with which gaze anchoring is established is affected by the difficulty of the visuomotor transformation and by the hemispace used for the reaches, but not by the reach trajectory or the difficulty of arm-joint coordination. That this adaptive gaze pattern implemented by the oculomotor system is more general than the diverse control strategies used for manual actions is a surprising finding. The oculomotor system's primary concern during this adaptive gaze control appears to be the predictability of the reach outcome, making it indifferent to the limb-motor system's choice of control strategy. Another adaptive gaze pattern emerges when off-line terminal visual feedback is used instead of on-line feedback for learning a transformed movement: gaze locations shift gradually from the target to the final position of the reaches, reflecting the oculomotor system's involvement in the preplanning of reaches. In this case, gaze location is critical for explicit strategic adjustments of reaches but not for implicit adaptation.
Taken together, oculomotor control is an integral component of the mechanisms that regulate various complex manual actions across a variety of behavioral contexts, providing the necessary visual information at specific locations and times to facilitate the planning, execution, and verification of manual actions.
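To make the learning paradigm concrete: the visuomotor transformations studied here (see the publications on "visuomotor adaptation with different rotation angles" below) rotate the cursor feedback relative to the actual hand movement, so the hand must be redirected for the cursor to reach the target. The following is only a minimal illustrative sketch of such a rotation under assumed details; the planar workspace, rotation about the start position, the function name rotated_cursor, and the 30° example angle are hypothetical and not taken from the project.

```python
import numpy as np

def rotated_cursor(hand_xy, origin_xy, angle_deg):
    """Map a hand position to rotated cursor feedback.

    The cursor displacement from the start position equals the hand
    displacement rotated by angle_deg (a generic visuomotor rotation;
    names and parameters are illustrative assumptions).
    """
    theta = np.radians(angle_deg)
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    displacement = np.asarray(hand_xy, dtype=float) - np.asarray(origin_xy, dtype=float)
    return np.asarray(origin_xy, dtype=float) + rot @ displacement

# With a 30° rotation, a straight-ahead hand movement yields cursor feedback
# displaced by 30°, so the hand must aim about 30° in the opposite direction
# for the cursor to land on the target -- the adjustment acquired during learning.
print(rotated_cursor(hand_xy=(0.0, 10.0), origin_xy=(0.0, 0.0), angle_deg=30.0))
```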
Project-Related Publications (Selection)
- Rand, M.K., & Heuer, H. (2019). Effects of Hand and Hemispace on Multisensory Integration of Hand Position and Visual Feedback. Frontiers in Psychology, 10, 237.
- Rand, M.K., & Heuer, H. (2013). Implicit and explicit representations of hand position in tool use. PLoS ONE, 8, e68471.
- Rentsch, S., & Rand, M.K. (2014). Eye-hand coordination during visuomotor adaptation with different rotation angles. PLoS ONE, 9, e109819.
- Rand, M.K. (2014). Segment interdependency and gaze anchoring during manual two-segment sequences. Experimental Brain Research, 232, 2753-2765.
- Rand, M.K., & Rentsch, S. (2015). Gaze locations affect explicit process but not implicit process during visuomotor adaptation. Journal of Neurophysiology, 113, 88-99.
- Rand, M.K., & Heuer, H. (2016). Effects of reliability and global context on explicit and implicit measures of sensed hand position in cursor-control tasks. Frontiers in Psychology, 6, 2056.
- Rand, M.K., & Rentsch, S. (2016). Eye-hand coordination during visuomotor adaptation with different rotation angles: Effects of terminal visual feedback. PLoS ONE, 11, e0164602.
- Rand, M.K., & Heuer, H. (2017). Contrasting effects of adaptation to a visuomotor rotation on explicit and implicit measures of sensory coupling. Psychological Research.
- Rand, M.K., & Rentsch, S. (2017). Eye-hand coordination during visuomotor adaptation: effects of hemispace and joint coordination. Experimental Brain Research, 235, 3645-3661.
- Rand, M.K., & Heuer, H. (2018). Dissociating explicit and implicit measures of sensed hand position in tool use: Effect of relative frequency of judging different objects. Attention, Perception, & Psychophysics, 80, 211-221.
- Rand, M.K. (2018). Effects of auditory feedback on movements with two-segment sequence and eye–hand coordination. Experimental Brain Research, 236, 3131-3148.
- Rand, M.K., & Shimansky, Y.P. (2018). Sensorimotor integration associated with transport-aperture coordination and tool-mediated reaching. In: Corbetta, D., & Santello, M. (eds.), Reach-to-Grasp Behavior: Brain, Behavior, and Modelling Across the Life Span. New York: Routledge, pp. 225-255.