This research seeks to ascertain the relative value of visual and proprioceptive motion feedback during force-based control of a non-self entity such as a powered prosthesis. Accurately controlling such a device is very difficult when the operator cannot see or feel the movement that results from applied forces. As an analogy to prosthesis use, we tested the relative importance of visual and proprioceptive motion feedback during targeted force-based movement. Thirteen human subjects performed a virtual finger-pointing task in which the virtual finger's velocity was always programmed to be directly proportional to the MCP joint torque applied by the subject's right index finger. During successive repetitions of the pointing task, the system conveyed the virtual finger's motion to the user through four combinations of graphical display (vision) and finger movement (proprioception). Success rate, speed, and qualitative ease of use were recorded, and visual motion feedback was found to increase all three performance measures. Proprioceptive motion feedback significantly improved success rate and ease of use, but it yielded slower motions. The results indicate that proprioceptive motion feedback improves human control of targeted movement in both sighted and unsighted conditions, supporting the pursuit of artificial proprioception for prosthetics and underscoring the importance of motion feedback for other force-controlled human-machine systems, such as interactive virtual environments and teleoperators.
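The torque-to-velocity mapping described above can be sketched as a simple admittance-style update loop. This is a minimal illustration, not the study's actual implementation; the gain and time step values are assumptions chosen for clarity.

```python
# Sketch of the force-to-velocity mapping: the virtual finger's velocity
# is directly proportional to the applied MCP joint torque, and position
# is obtained by integrating that velocity over time.

GAIN = 2.0  # rad/s per N*m -- illustrative value, not from the study
DT = 0.01   # integration time step in seconds -- illustrative value


def step(position, torque, gain=GAIN, dt=DT):
    """Advance the virtual finger by one time step.

    Velocity is proportional to the applied torque; position
    integrates that velocity with a simple Euler step.
    """
    velocity = gain * torque
    return position + velocity * dt, velocity


# Example: hold a constant 0.5 N*m torque for 1 second (100 steps).
pos = 0.0
for _ in range(100):
    pos, vel = step(pos, 0.5)
# Constant torque yields constant velocity, hence linear motion.
```

Under this mapping, releasing the applied torque stops the virtual finger immediately, which is what makes the motion-feedback channels (vision and proprioception) critical for stopping accurately on a target.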