Decoding with limited neural data: A mixture of time-warped trajectory models for directional reaches

Elaine A. Corbett*, Eric J. Perreault, Konrad P. Körding

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Neuroprosthetic devices promise to allow paralyzed patients to perform the necessary functions of everyday life. However, to allow patients to use such tools, it is necessary to decode their intent from neural signals such as electromyograms (EMGs). Because these signals are noisy, state-of-the-art decoders integrate information over time. One systematic way of doing this is by taking into account the natural evolution of the state of the body, using a so-called trajectory model. Here we use two insights about movements to enhance our trajectory model: (1) at any given time, there is a small set of likely movement targets, potentially identified by gaze; (2) reaches are produced at varying speeds. We decoded natural reaching movements using EMGs of muscles that might be available from an individual with spinal cord injury. Target estimates found from tracking eye movements were incorporated into the trajectory model, while a mixture model accounted for the inherent uncertainty in these estimates. Warping the trajectory model in time using a continuous estimate of the reach speed enabled accurate decoding of faster reaches. We found that the choice of richer trajectory models, such as those incorporating target or speed, improves decoding, particularly when only a small number of EMGs is available.
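
The mixture of target-conditioned trajectory models described in the abstract can be illustrated as a small bank of Kalman filters, one per candidate target, whose mixture weights are updated from the observation likelihoods at each time step. The sketch below is a minimal illustration under assumed models: the attractor-style dynamics, the random "EMG" observation matrix, the speed-scaled warp factor alpha, and the helper names kalman_step and mixture_decode are illustrative assumptions, not the authors' fitted models or code.

```python
# Hedged sketch: a mixture of target-conditioned linear-Gaussian trajectory
# models combined in a bank of Kalman filters. All matrices, the simulated
# "EMG" observation model, and the speed-based time warp are illustrative
# assumptions, not the paper's fitted models.
import numpy as np

def kalman_step(x, P, y, A, b, Q, C, R):
    """One predict/update step of a Kalman filter with affine dynamics
    x' = A x + b + w and observations y = C x + v.
    Returns (updated state, updated covariance, log-likelihood of y)."""
    # Predict
    x_pred = A @ x + b
    P_pred = A @ P @ A.T + Q
    # Update
    S = C @ P_pred @ C.T + R                  # innovation covariance
    K = P_pred @ C.T @ np.linalg.inv(S)       # Kalman gain
    innov = y - C @ x_pred
    x_new = x_pred + K @ innov
    P_new = (np.eye(len(x)) - K @ C) @ P_pred
    # Gaussian log-likelihood of the observation under this component
    _, logdet = np.linalg.slogdet(2 * np.pi * S)
    ll = -0.5 * (logdet + innov @ np.linalg.solve(S, innov))
    return x_new, P_new, ll

def mixture_decode(emg, targets, speed, dt=0.01):
    """Decode 2-D hand position from EMG-like features using one trajectory
    model per candidate target; mixture weights track target probability."""
    n_targets, d_obs = len(targets), emg.shape[1]
    x = [np.zeros(2) for _ in range(n_targets)]           # per-component state
    P = [np.eye(2) for _ in range(n_targets)]
    w = np.full(n_targets, 1.0 / n_targets)               # prior target weights
    C = np.random.default_rng(0).standard_normal((d_obs, 2))  # toy EMG map
    Q, R = 1e-4 * np.eye(2), 0.1 * np.eye(d_obs)
    decoded = []
    for y in emg:
        log_w = np.log(w)
        for k, tgt in enumerate(targets):
            # Time-warped attractor dynamics: a faster speed estimate pulls
            # the state more strongly toward the candidate target per step.
            alpha = np.clip(speed * dt, 0.0, 1.0)
            A = (1 - alpha) * np.eye(2)
            b = alpha * np.asarray(tgt, float)
            x[k], P[k], ll = kalman_step(x[k], P[k], y, A, b, Q, C, R)
            log_w[k] += ll
        w = np.exp(log_w - log_w.max())
        w /= w.sum()                                       # posterior over targets
        decoded.append(sum(wk * xk for wk, xk in zip(w, x)))
    return np.array(decoded), w

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    emg = rng.standard_normal((50, 6))                     # 50 samples, 6 "EMGs"
    targets = [(0.2, 0.1), (-0.1, 0.3)]                    # e.g. candidates from gaze
    traj, target_prob = mixture_decode(emg, targets, speed=1.5)
    print(traj[-1], target_prob)
```

In this toy setup, the mixture weights play the role of a posterior probability over gaze-identified targets, and the continuous speed estimate warps the trajectory model in time by scaling how far the state moves toward each candidate target on every step.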

Original language: English (US)
Article number: 036002
Journal: Journal of Neural Engineering
Volume: 9
Issue number: 3
State: Published - Jun 2012

ASJC Scopus subject areas

  • Cellular and Molecular Neuroscience
  • Biomedical Engineering
