People tend to make straight and smooth hand movements when reaching for an object. These trajectory features are resistant to perturbation, and both proprioceptive and visual feedback may guide the adaptive updating of motor commands enforcing this regularity. How is information from the two senses combined to generate a coherent internal representation of how the arm moves? Here we show that eliminating visual feedback of hand-path deviations from the straight-line reach (constraining visual feedback of motion within a virtual "visual channel") prevents compensation of initial direction errors induced by perturbations. Because adaptive reduction in direction errors occurred with proprioception alone, proprioceptive and visual information are not combined in this reaching task using a fixed, linear weighting scheme, as reported for static tasks not requiring arm motion. A computer model can explain these findings, assuming that proprioceptive estimates of initial limb posture are used to select motor commands for a desired reach and that visual feedback of hand-path errors brings proprioceptive estimates into registration with a visuocentric representation of limb position relative to its target. Simulations demonstrate that errors in estimating the initial limb configuration lead to movement direction errors, as observed experimentally. Registration improves movement accuracy when veridical visual feedback is provided but is not invoked when hand-path errors are eliminated. However, the visual channel did not prevent adjustment of terminal movement features maximizing hand-path smoothness. Thus visual and proprioceptive feedback may be combined in fundamentally different ways during trajectory control and final position regulation of reaching movements.
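The core mechanism of the model — a biased proprioceptive estimate of initial limb posture producing initial movement direction errors, which vanish once visual feedback registers the estimate with the true posture — can be illustrated with a minimal planar two-joint arm sketch. All numerical values here (link lengths, posture bias, step size) are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

L1, L2 = 0.30, 0.33  # upper-arm and forearm link lengths (m), hypothetical

def fk(q):
    """Forward kinematics: hand position from shoulder/elbow angles."""
    s, e = q
    return np.array([L1 * np.cos(s) + L2 * np.cos(s + e),
                     L1 * np.sin(s) + L2 * np.sin(s + e)])

def jac(q):
    """Jacobian of hand position with respect to joint angles."""
    s, e = q
    return np.array([[-L1 * np.sin(s) - L2 * np.sin(s + e), -L2 * np.sin(s + e)],
                     [ L1 * np.cos(s) + L2 * np.cos(s + e),  L2 * np.cos(s + e)]])

def initial_direction_error(q_true, bias, target, step=0.02):
    """Plan a small straight-line step toward the target using the
    (possibly biased) proprioceptive posture estimate, execute the
    resulting joint command on the true arm, and return the angular
    error (deg) between the actual hand direction and the true
    hand-to-target direction."""
    q_est = q_true + bias                       # proprioceptive estimate
    plan = target - fk(q_est)                   # planned displacement
    plan_dir = plan / np.linalg.norm(plan)
    dq = np.linalg.solve(jac(q_est), step * plan_dir)  # joint command
    actual = fk(q_true + dq) - fk(q_true)       # true hand displacement
    true_dir = target - fk(q_true)
    true_dir /= np.linalg.norm(true_dir)
    cosang = actual @ true_dir / np.linalg.norm(actual)
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

q_true = np.array([np.pi / 4, np.pi / 2])       # true initial posture
target = fk(q_true) + np.array([0.0, 0.10])     # 10 cm forward reach
bias = np.array([0.10, -0.15])                  # rad, misestimated posture

err_biased = initial_direction_error(q_true, bias, target)
err_registered = initial_direction_error(q_true, np.zeros(2), target)
print(f"biased posture estimate:   {err_biased:.1f} deg direction error")
print(f"after visual registration: {err_registered:.1f} deg direction error")
```

In this sketch the biased estimate yields a clear initial direction error, while zeroing the bias (standing in for visual registration of the proprioceptive estimate) makes the hand leave nearly straight toward the target, consistent with the simulation result summarized above.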