Rapid adaptation of brain–computer interfaces to new neuronal ensembles or participants via generative modelling

Shixian Wen*, Allen Yin, Tommaso Furlanello, M. G. Perich, L. E. Miller, Laurent Itti

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

For brain–computer interfaces (BCIs), obtaining sufficient training data for algorithms that map neural signals onto actions can be difficult, expensive or even impossible. Here we report the development and use of a generative model—a model that synthesizes a virtually unlimited number of new samples from a learned data distribution—that learns mappings between hand kinematics and the associated neural spike trains. The generative spike-train synthesizer is trained on data from one recording session with a monkey performing a reaching task and can be rapidly adapted to new sessions or monkeys by using limited additional neural data. We show that the adapted model can synthesize new spike trains, accelerating the training and improving the generalization of BCI decoders. The approach is fully data-driven and hence applicable to uses of BCIs beyond motor control.
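The pipeline the abstract describes—pretrain a kinematics-to-spikes generator on one plentiful session, adapt it to a new session with limited data, then synthesize spike trains to augment decoder training—can be illustrated with a toy numpy sketch. This is a minimal stand-in, not the paper's method: a linear log-rate model replaces the deep generative network, the data are synthetic, and all variable names (`W_gen`, `A`, etc.) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy ground truth: each session's neurons are a linear readout of 2-D hand velocity.
n_a, n_b, d = 30, 25, 2                    # neurons in sessions A/B, kinematic dims
W_true_a = rng.normal(size=(d, n_a))
kin_a = rng.normal(size=(2000, d))         # abundant session-A kinematics
spikes_a = rng.poisson(np.exp(0.3 * kin_a @ W_true_a))   # synthetic spike counts

# 1) "Pretrain" the generator on the plentiful session-A data.
#    A least-squares fit of log-counts stands in for the deep generative model.
log_counts_a = np.log(spikes_a + 1.0)
W_gen, *_ = np.linalg.lstsq(kin_a, log_counts_a, rcond=None)

# 2) Rapid adaptation: session B records different neurons and has little data,
#    so we fit only a small alignment map A on 100 session-B trials.
W_true_b = rng.normal(size=(d, n_b))
kin_b = rng.normal(size=(100, d))
spikes_b = rng.poisson(np.exp(0.3 * kin_b @ W_true_b))
gen_out_b = kin_b @ W_gen                  # pretrained generator's output space
A, *_ = np.linalg.lstsq(gen_out_b, np.log(spikes_b + 1.0), rcond=None)

# 3) Synthesize unlimited session-B-like spike trains from novel kinematics,
#    to augment the training set of a downstream BCI decoder.
kin_new = rng.normal(size=(5000, d))
synth_rates = np.clip(np.exp(kin_new @ W_gen @ A) - 1.0, 0.0, None)
synth_spikes_b = rng.poisson(synth_rates)

print(synth_spikes_b.shape)                # 5000 synthetic trials, n_b neurons
```

The key structural point survives the simplification: only the small alignment map is fitted on new-session data, while the bulk of the kinematics-to-spikes mapping is reused from the pretrained model.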

Original language: English (US)
Journal: Nature Biomedical Engineering
DOIs:
State: Accepted/In press - 2021

ASJC Scopus subject areas

  • Biotechnology
  • Bioengineering
  • Medicine (miscellaneous)
  • Biomedical Engineering
  • Computer Science Applications

