Stiffness discrimination with visual and proprioceptive cues

Netta Gurari*, Katherine J. Kuchenbecker, Allison M. Okamura

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

28 Scopus citations

Abstract

This study compares the Weber fraction for human perception of stiffness among three conditions: vision, proprioceptive motion feedback, and their combination. To make comparisons between these feedback conditions, a novel haptic device was designed that senses the spring behavior through encoder and force measurements, and implements a controller to render linear virtual springs so that the stimuli displayed haptically could be compared with their visual counterparts. The custom-designed, torque-controlled haptic interface non-invasively controls the availability of proprioceptive motion feedback in unimpaired individuals using a virtual environment. When proprioception is available, the user feels a metacarpophalangeal (MCP) joint rotation that is proportional to his or her finger force. When proprioception is not available, the actual finger is not allowed to move, but a virtual finger displayed graphically moves in proportion to the user's applied force. Visual feedback is provided and removed by turning the graphical display on and off. Weber fractions were generated from an experiment in which users examined pairs of springs and attempted to identify the spring with higher stiffness. To account for slight trial-to-trial variations in the relationship between force and position in the proprioceptive feedback conditions, our analysis uses measurements of the actual rendered stiffness, rather than the commanded stiffness. Results for 10 users give average Weber fractions of 0.056 for vision, 0.036 for proprioception, and 0.039 for their combination, indicating that proprioception is important for stiffness perception in this experimental setup. The long-term goal of this research is to motivate and develop methods for providing proprioceptive feedback to wearers of dexterous upper-limb prostheses.
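The analysis described above rests on two computations: estimating the stiffness actually rendered in each trial from force and position measurements, and forming a Weber fraction (just-noticeable difference divided by reference stiffness). A minimal sketch of both, assuming a simple least-squares fit of force against displacement; this is illustrative only, not the authors' code, and all variable names and numeric values below are hypothetical:

```python
import numpy as np

def rendered_stiffness(positions, forces):
    """Least-squares slope of force vs. position: the spring constant
    the user actually felt, which can differ slightly from the
    commanded stiffness due to trial-to-trial rendering variations."""
    slope, _intercept = np.polyfit(positions, forces, 1)
    return slope

def weber_fraction(reference_stiffness, jnd):
    """Weber fraction = just-noticeable difference / reference value."""
    return jnd / reference_stiffness

# Synthetic example: a 200 N/m virtual spring sampled with small force noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 0.02, 50)                   # displacement (m)
f = 200.0 * x + rng.normal(0.0, 0.01, x.size)    # measured force (N)

k = rendered_stiffness(x, f)       # slope estimate, close to 200 N/m
wf = weber_fraction(200.0, 7.2)    # 7.2/200 = 0.036
```

A Weber fraction of 0.036 for a 200 N/m reference would correspond to a discrimination threshold of about 7.2 N/m; fitting the measured force-position data rather than trusting the commanded value is what lets the analysis absorb rendering imperfections.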

Original language: English (US)
Title of host publication: Proceedings - 3rd Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, World Haptics 2009
Pages: 121-126
Number of pages: 6
DOIs
State: Published - Jul 8 2009
Event: 3rd Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, World Haptics 2009 - Salt Lake City, UT, United States
Duration: Mar 18 2009 - Mar 20 2009

Publication series

Name: Proceedings - 3rd Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, World Haptics 2009


Keywords

  • H.1.2 [models and principles]
  • H.5.1 [information interfaces and presentation]
  • J.4 [social and behavioral sciences]
  • Psychology
  • User interfaces-haptic I/O
  • User/machine systems-human factors

ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction
  • Control and Systems Engineering

Cite this

Gurari, N., Kuchenbecker, K. J., & Okamura, A. M. (2009). Stiffness discrimination with visual and proprioceptive cues. In Proceedings - 3rd Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, World Haptics 2009 (pp. 121-126). [4810845] (Proceedings - 3rd Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, World Haptics 2009). https://doi.org/10.1109/WHC.2009.4810845