Perception of springs with visual and proprioceptive motion cues: Implications for prosthetics

Netta Gurari, Katherine J. Kuchenbecker, Allison M. Okamura

Research output: Contribution to journal › Article

14 Citations (Scopus)

Abstract

Manipulating objects with an upper limb prosthesis requires significantly more visual attention than doing the same task with an intact limb. Prior work and comments from individuals lacking proprioception indicate that conveying prosthesis motion through a nonvisual sensory channel would reduce and possibly remove the need to watch the prosthesis. To motivate the design of suitable sensory substitution devices, this study investigates the difference between seeing a virtual prosthetic limb move and feeling one's real limb move. Fifteen intact subjects controlled a virtual prosthetic finger in a one-degree-of-freedom rotational spring discrimination task. A custom haptic device was used to measure both real finger position and applied finger force, and the resulting prosthetic finger movement was displayed visually (on a computer screen) and/or proprioceptively (by allowing the subject's real finger to move). Spring discrimination performance was tested for three experimental sensory conditions (visual motion, proprioceptive motion, and visual and proprioceptive motion) using the method of constant stimuli, with a reference stiffness of 290 N/m. During each trial, subjects sequentially pressed their right index finger on a pair of hard-surfaced virtual springs and decided which was stiffer. No significant performance differences were found between the three experimental sensory conditions, but subjects perceived proprioceptive motion to be significantly more useful than visual motion. These results imply that relaying proprioceptive information through a nonvisual channel could reduce visual attention during prosthesis control while maintaining task performance, thus improving the upper limb prosthesis experience.
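
As context for the method of constant stimuli described in the abstract, the sketch below shows how two-interval stiffness-discrimination responses of this kind are commonly reduced to a point of subjective equality (PSE) and a just-noticeable difference (JND) by maximum-likelihood fitting of a cumulative-Gaussian psychometric function (maximum likelihood estimation appears among the paper's keywords). This is not the authors' analysis code: only the 290 N/m reference stiffness comes from the abstract, while the comparison stiffness levels, trial counts, and response counts are hypothetical placeholders.

# Minimal sketch (Python, NumPy/SciPy): maximum-likelihood fit of a
# cumulative-Gaussian psychometric function to hypothetical two-interval
# spring-discrimination data gathered with the method of constant stimuli.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

REFERENCE_STIFFNESS = 290.0  # N/m, reference spring stated in the abstract

# Hypothetical comparison stiffnesses (N/m), trials per level, and counts of
# trials on which the comparison spring was judged stiffer than the reference.
comparison = np.array([230.0, 260.0, 275.0, 290.0, 305.0, 320.0, 350.0])
n_trials = np.full_like(comparison, 20.0)
n_stiffer = np.array([2.0, 5.0, 8.0, 10.0, 13.0, 16.0, 19.0])

def negative_log_likelihood(params):
    """Binomial negative log-likelihood of a cumulative-Gaussian psychometric
    function with mean mu (PSE) and standard deviation sigma (slope)."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # parameterize in log space to keep sigma > 0
    p = norm.cdf(comparison, loc=mu, scale=sigma)
    p = np.clip(p, 1e-6, 1.0 - 1e-6)  # guard against log(0)
    return -np.sum(n_stiffer * np.log(p) + (n_trials - n_stiffer) * np.log(1.0 - p))

# Start the search at the reference stiffness with a rough slope guess.
result = minimize(negative_log_likelihood,
                  x0=[REFERENCE_STIFFNESS, np.log(30.0)],
                  method="Nelder-Mead")

mu_hat, sigma_hat = result.x[0], np.exp(result.x[1])
jnd = sigma_hat * norm.ppf(0.75)  # 75%-correct just-noticeable difference

print(f"PSE  ≈ {mu_hat:.1f} N/m")
print(f"JND  ≈ {jnd:.1f} N/m")
print(f"Weber fraction ≈ {jnd / REFERENCE_STIFFNESS:.2f}")

A fit like this yields one JND per subject and sensory condition; comparing those JNDs across the visual, proprioceptive, and combined conditions is one standard way to test whether the motion-feedback channel affects discrimination performance.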

Original language: English (US)
Article number: 6392963
Pages (from-to): 102-114
Number of pages: 13
Journal: IEEE Transactions on Human-Machine Systems
Volume: 43
Issue number: 1
DOI: 10.1109/TSMCA.2012.2221038
State: Published - Jan 1 2013

Keywords

  • Force feedback
  • Maximum likelihood estimation
  • Prosthetic limbs
  • Robot motion

ASJC Scopus subject areas

  • Human Factors and Ergonomics
  • Control and Systems Engineering
  • Signal Processing
  • Human-Computer Interaction
  • Computer Science Applications
  • Computer Networks and Communications
  • Artificial Intelligence

Cite this

@article{d20106c180504b6bb187de8679d1d9ee,
title = "Perception of springs with visual and proprioceptive motion cues: Implications for prosthetics",
keywords = "Force feedback, Maximum likelihood estimation, Prosthetic limbs, Robot motion",
author = "Netta Gurari and Kuchenbecker, {Katherine J.} and Okamura, {Allison M.}",
year = "2013",
month = "1",
day = "1",
doi = "10.1109/TSMCA.2012.2221038",
language = "English (US)",
volume = "43",
pages = "102--114",
journal = "IEEE Transactions on Human-Machine Systems",
issn = "2168-2291",
publisher = "IEEE Systems, Man, and Cybernetics Society",
number = "1",

}
