We examined whether visual and proprioceptive estimates of transient (mid-reach) target capture errors contribute to motor adaptation according to the probabilistic rules of information integration used for perception. Healthy adult humans grasped and moved a robotic handle between targets in the horizontal plane while the robot generated spring-like loads that varied unpredictably from trial to trial. For some trials, a visual cursor faithfully tracked hand motion. In others, the handle's position was locked and subjects viewed motion of a point-mass cursor driven by hand forces. In yet other trials, cursor feedback was dissociated from hand motion or eliminated altogether. We used time- and frequency-domain analyses to characterize how sensorimotor memories influence performance on subsequent reaches. When the senses were used separately, subjects were better at rejecting physical disturbances applied to the hand than virtual disturbances applied to the cursor. In part, this observation reflected differences in how subjects used sensorimotor memories to adapt to perturbations when performance feedback was limited to only proprioceptive or visual information channels. When both vision and proprioception were available to guide movement, subjects processed memories in a manner indistinguishable from the vision-only condition, regardless of whether the cursor tracked the hand faithfully or whether we experimentally dissociated motions of the hand and cursor. In contrast to perceptual tasks wherein vision and proprioception both contribute to an optimal estimate of limb state, our findings support a switched-input, multisensory model of predictive load compensation wherein visual feedback of transient performance errors overwhelmingly dominates proprioception in determining adaptive reach performance.
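The trial-series analyses mentioned above can be illustrated with a minimal sketch. Everything here is an assumption for illustration only, not the authors' actual pipeline: we model sensorimotor memory as a first-order (lag-1) dependence of each trial's reach error on the preceding trial's error, estimate that coefficient in the time domain, and inspect the same trial sequence in the frequency domain (cycles per trial, not per second).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic trial-by-trial reach-error series (illustrative only):
# a first-order process in which each trial's error partly reflects
# a correction carried over from the previous trial's error.
n_trials = 200
memory = 0.6  # hypothetical lag-1 "sensorimotor memory" coefficient
noise = rng.standard_normal(n_trials)
err = np.zeros(n_trials)
for t in range(1, n_trials):
    err[t] = memory * err[t - 1] + noise[t]

# Time-domain characterization: least-squares estimate of the
# lag-1 coefficient from consecutive trial pairs.
x, y = err[:-1], err[1:]
b_hat = np.dot(x, y) / np.dot(x, x)

# Frequency-domain characterization: power spectrum of the
# mean-removed trial series; the frequency axis is in
# cycles per trial because "time" here is the trial index.
freqs = np.fft.rfftfreq(n_trials, d=1.0)
power = np.abs(np.fft.rfft(err - err.mean())) ** 2
```

A strong low-frequency component in `power`, together with a positive `b_hat`, is the kind of signature one would expect when memories of one trial's error shape the next trial's motor command.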
Available at: http://works.bepress.com/robert_scheidt/1/