Stiffness discrimination with visual and proprioceptive cues

Netta Gurari*, Katherine J. Kuchenbecker, Allison M. Okamura

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

26 Citations (Scopus)

Abstract

This study compares the Weber fraction for human perception of stiffness among three conditions: vision, proprioceptive motion feedback, and their combination. To enable comparisons between these feedback conditions, a novel haptic device was designed that senses spring behavior through encoder and force measurements and implements a controller to render linear virtual springs, so that the stimuli displayed haptically could be compared with their visual counterparts. The custom-designed, torque-controlled haptic interface non-invasively controls the availability of proprioceptive motion feedback in unimpaired individuals using a virtual environment. When proprioception is available, the user feels a metacarpophalangeal (MCP) joint rotation that is proportional to his or her finger force. When proprioception is not available, the actual finger is not allowed to move, but a virtual finger displayed graphically moves in proportion to the user's applied force. Visual feedback is provided and removed by turning this graphical display on and off. Weber fractions were generated from an experiment in which users examined pairs of springs and attempted to identify the spring with the higher stiffness. To account for slight trial-to-trial variations in the relationship between force and position in the proprioceptive feedback conditions, our analysis uses measurements of the actual rendered stiffness rather than the commanded stiffness. Results for 10 users give average Weber fractions of 0.056 for vision, 0.036 for proprioception, and 0.039 for their combination, indicating that proprioception is important for stiffness perception in this experimental setup. The long-term goal of this research is to motivate and develop methods for providing proprioceptive feedback to wearers of dexterous upper-limb prostheses.
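The analysis the abstract describes has two quantitative steps: estimating the stiffness actually rendered in a trial from force and position measurements, and expressing discrimination performance as a Weber fraction (just-noticeable difference divided by the reference magnitude). A minimal sketch of both steps follows; the function names and the numerical values (a 200 N/m reference spring, a 7.2 N/m JND) are illustrative assumptions, not values taken from the paper:

```python
import numpy as np

def rendered_stiffness(positions, forces):
    """Least-squares slope of force vs. position: the stiffness
    actually rendered in a trial, as opposed to the commanded value."""
    slope, _intercept = np.polyfit(positions, forces, 1)
    return slope

def weber_fraction(jnd, reference):
    """Weber fraction: just-noticeable difference divided by the
    reference stimulus magnitude."""
    return jnd / reference

# Illustrative trial: a spring commanded at 200 N/m but rendered
# slightly off, with small force-sensor noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 0.02, 50)               # finger displacement (m)
f = 198.0 * x + rng.normal(0.0, 0.01, x.size)  # measured force (N)
k_actual = rendered_stiffness(x, f)          # close to 198 N/m

# An assumed JND of 7.2 N/m about a 200 N/m reference gives a
# Weber fraction of 7.2 / 200 = 0.036.
wf = weber_fraction(7.2, 200.0)
```

Fitting the measured force-position data, rather than trusting the commanded stiffness, is what lets the analysis absorb the trial-to-trial rendering variations mentioned in the abstract.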

Original language: English (US)
Title of host publication: Proceedings - 3rd Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, World Haptics 2009
Pages: 121-126
Number of pages: 6
ISBN: 9781424438587
DOI: 10.1109/WHC.2009.4810845
State: Published - Jul 8 2009
Event: 3rd Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, World Haptics 2009 - Salt Lake City, UT, United States
Duration: Mar 18 2009 - Mar 20 2009

Publication series

Name: Proceedings - 3rd Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, World Haptics 2009

Keywords

  • H.1.2 [models and principles]
  • H.5.1 [information interfaces and presentation]
  • J.4 [social and behavioral sciences]
  • Psychology
  • User interfaces-haptic I/O
  • User/machine systems-human factors

ASJC Scopus subject areas

  • Artificial Intelligence
  • Human-Computer Interaction
  • Control and Systems Engineering

Cite this

Gurari, N., Kuchenbecker, K. J., & Okamura, A. M. (2009). Stiffness discrimination with visual and proprioceptive cues. In Proceedings - 3rd Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, World Haptics 2009 (pp. 121-126). [4810845] (Proceedings - 3rd Joint EuroHaptics Conference and Symposium on Haptic Interfaces for Virtual Environment and Teleoperator Systems, World Haptics 2009). https://doi.org/10.1109/WHC.2009.4810845