Intent Prediction Based on Biomechanical Coordination of EMG and Vision-Filtered Gaze for End-Point Control of an Arm Prosthesis

Nili E. Krausz*, Denys Lamotte, Iason Batzianoulis, Levi J. Hargrove, Silvestro Micera, Aude Billard

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

31 Scopus citations

Abstract

We propose a novel controller for powered prosthetic arms, in which fused EMG and gaze data predict the desired end-point for a full arm prosthesis, which could drive the forward motion of individual joints. We recorded EMG, gaze, and motion-tracking data during pick-and-place trials with 7 able-bodied subjects. Subjects positioned an object above a random target on a virtual interface, each completing around 600 trials. On average across all trials and subjects, gaze preceded EMG and followed a repeatable pattern that allowed for prediction. A computer vision algorithm was used to extract the initial and target fixations and estimate the target position in 2D space. Two SVRs were trained with EMG data to predict the x- and y-positions of the hand; results showed that the y-estimate was significantly better than the x-estimate. The EMG and gaze predictions were fused using a Kalman Filter-based approach, and the positional error from using EMG alone was significantly higher than from the fusion of EMG and gaze. The final target position Root Mean Squared Error (RMSE) decreased from 9.28 cm with an EMG-only prediction to 6.94 cm when using gaze-EMG fusion. This error also increased significantly when some or all arm muscle signals were removed. However, with fused EMG and gaze, there was no significant difference between predictors that included all muscles and those that used only a subset of muscles.
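The fusion step described above — combining a noisy EMG-based end-point estimate with a gaze-based one via a Kalman-filter-style measurement update — can be sketched as follows. This is a minimal illustration, not the paper's actual pipeline: the measurement covariances (`R_emg`, `R_gaze`), the 50 cm workspace, and the 600 simulated trials are all assumed values chosen only to mirror the experiment's scale, and the static (single-update) fusion stands in for the full time-varying filter.

```python
import numpy as np

def fuse_estimates(z_emg, R_emg, z_gaze, R_gaze):
    """Fuse two noisy 2D target estimates with a static Kalman
    measurement update (equivalent to inverse-covariance weighting).
    z_*: (2,) position estimates; R_*: (2,2) noise covariances."""
    # Treat the EMG estimate as the prior and the gaze estimate as
    # a measurement; K is the Kalman gain.
    K = R_emg @ np.linalg.inv(R_emg + R_gaze)
    fused = z_emg + K @ (z_gaze - z_emg)
    P = (np.eye(2) - K) @ R_emg  # covariance of the fused estimate
    return fused, P

def rmse(pred, true):
    # RMS Euclidean distance between predicted and true 2D targets
    return np.sqrt(np.mean(np.sum((pred - true) ** 2, axis=1)))

rng = np.random.default_rng(0)
targets = rng.uniform(0, 50, size=(600, 2))   # hypothetical targets, cm
R_emg = np.diag([9.0**2, 4.0**2])             # assumed: EMG worse in x than y
R_gaze = np.diag([5.0**2, 5.0**2])            # assumed gaze noise
emg_est = targets + rng.multivariate_normal([0, 0], R_emg, size=600)
gaze_est = targets + rng.multivariate_normal([0, 0], R_gaze, size=600)

fused = np.array([fuse_estimates(e, R_emg, g, R_gaze)[0]
                  for e, g in zip(emg_est, gaze_est)])
print(f"EMG-only RMSE: {rmse(emg_est, targets):.2f} cm")
print(f"Fused RMSE:    {rmse(fused, targets):.2f} cm")
```

Because the gain weights each axis by its relative noise, the fused estimate's error is lower than either input's on both axes — the same qualitative effect the abstract reports (9.28 cm EMG-only vs. 6.94 cm fused).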

Original language: English (US)
Article number: 9088146
Pages (from-to): 1471-1480
Number of pages: 10
Journal: IEEE Transactions on Neural Systems and Rehabilitation Engineering
Volume: 28
Issue number: 6
DOIs
State: Published - Jun 2020

Funding

Manuscript received December 11, 2019; revised April 13, 2020; accepted April 29, 2020. Date of publication May 6, 2020; date of current version June 5, 2020. This work was supported in part by the Swiss National Science Foundation through National Centre of Competence in Research in Robotics (NCCR Robotics) Wearable Scenario, NCCR Robotics Ph.D. Exchange Fellowship under Grant 51NF40–160592, and in part by the NIH T32 PRND Training Grant. (Nili E. Krausz and Denys Lamotte are co-first authors.) (Corresponding author: Nili E. Krausz.) Nili E. Krausz was with the Shirley Ryan AbilityLab (formerly RIC), Chicago, IL 60611 USA, and also with the LASA Laboratory, EPFL, 1015 Lausanne, Switzerland. She is now with the Rehabilitation Institute of Chicago, Northwestern University, Chicago, IL 60611 USA, and also with the Weizmann Institute of Science, Rehovot 7610001, Israel (e-mail: [email protected]).

Keywords

  • Kalman Filter
  • Prosthetics
  • computer vision
  • electromyography
  • end-point control
  • gaze tracking
  • sensory fusion
  • upper limb prosthesis

ASJC Scopus subject areas

  • Internal Medicine
  • General Neuroscience
  • Biomedical Engineering
  • Rehabilitation
