Integrating Human and Machine Learning for Enabling Co-Adaptive Body-Machine Interfaces

Project: Research project

Project Details

Description

After suffering an injury to the spinal cord, paralyzed survivors need to take advantage of their residual mobility to recover independence through the operation of a wheelchair and other assistive devices. In this endeavor, their challenges are analogous to those faced by a surgeon learning to perform laparoscopy with a Da Vinci system. Common to these situations is the need to learn a novel motor skill and to build a new competence relating movements to their sensed consequences. Here, we consider how this challenge arises through, and may be facilitated by, an adaptive interface: a body-machine interface (BoMI) that maps signals derived from body motions onto commands for an external device. This project addresses the key objective of interface customization through an adaptive approach based on a collaborative interaction between human and machine learning. The working hypotheses on which this project is based are H1) that, through a dynamical learning process, human operators form an internal model of the map connecting their body motions to the actions performed by the device, and H2) that the human-machine interface can be adapted to its users by an update rule that tracks the operator's learning process. The success of the research project will lead to a new family of intelligent human-machine interfaces capable of "customizing themselves" to the evolving abilities of their users.

Intellectual Merit

The transformative character of the proposal results from the synergy between machine-learning-based customization of the interface and the rigorous modeling and experimental analysis of human motor learning in both linear and nonlinear frameworks. In a BoMI, signals generated by the user's residual body motions are mapped onto a lower-dimensional latent manifold of commands for an external device. Once this mapping is established, the user's task is to produce body motions that guide the device toward set goals. This implies solving the ill-posed inverse problem of mapping a low-dimensional desired device behavior to a higher-dimensional body configuration. Accordingly, the first specific objective is to characterize human learning as a dynamical process through which each user forms an inverse model of the BoMI mapping. This objective will be pursued by comparing a state-based model of human learning to empirical data obtained from both unimpaired and tetraplegic subjects. The second specific objective is to develop machine learning rules and schedules for updating the BoMI map based on monitored characteristics of the human learning dynamics. The study will focus on matching the interface to the evolving internal model developed by the machine's user. A key element of the project is the use of a family of machine learning techniques, autoencoder networks (AENs), which both offer a unified framework for a large class of linear and nonlinear dimensionality-reduction methods and provide a representation for co-adaptive learning. The success of this project is expected to yield a new family of low-cost, non-invasive interfaces based on capturing the dynamics of human learning; the concept will be applicable to a broad range of human-machine interactions beyond assistive devices.
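To make the BoMI concept concrete, the following minimal sketch (illustrative only; the class name, signal dimensions, learning rate, and plain gradient-descent update are assumptions, not the project's actual implementation) shows a linear autoencoder that maps high-dimensional body-motion signals onto a two-dimensional command space, with periodic re-fitting on recent data standing in for the co-adaptive update rule that tracks the user's evolving movement repertoire.

```python
# Minimal sketch of an autoencoder-style BoMI map (illustrative assumptions only).
import numpy as np

class LinearBoMI:
    def __init__(self, n_signals=8, n_commands=2, lr=1e-2):
        rng = np.random.default_rng(0)
        # Encoder: body-motion signals -> low-dimensional device commands.
        self.W_enc = rng.normal(scale=0.1, size=(n_commands, n_signals))
        # Decoder: commands -> reconstructed body signals (used only for training).
        self.W_dec = rng.normal(scale=0.1, size=(n_signals, n_commands))
        self.lr = lr

    def encode(self, x):
        """Map one body-motion sample (n_signals,) to device commands (n_commands,)."""
        return self.W_enc @ x

    def adapt(self, X):
        """One gradient step of reconstruction learning on a batch X (n, n_signals).
        Repeating this on recently recorded data is a crude stand-in for the
        co-adaptive update that follows the user's learning."""
        Z = X @ self.W_enc.T            # latent commands for the batch
        X_hat = Z @ self.W_dec.T        # reconstructed body signals
        err = X_hat - X
        grad_dec = err.T @ Z / len(X)
        grad_enc = (self.W_dec.T @ err.T @ X) / len(X)
        self.W_dec -= self.lr * grad_dec
        self.W_enc -= self.lr * grad_enc
        return float(np.mean(err ** 2))  # reconstruction error, a proxy for map quality

# Example: co-adapt the map over simulated practice sessions, then issue a command.
rng = np.random.default_rng(1)
bomi = LinearBoMI()
for session in range(5):
    X = rng.normal(size=(200, 8))        # placeholder for recorded body signals
    mse = bomi.adapt(X)
command = bomi.encode(X[-1])             # latest 2-D device command
```

In the project as described, nonlinear AENs and update schedules informed by the state-based model of human learning would take the place of this plain reconstruction-error step.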
Broader Impacts

The broader impacts of the proposed work concern a) the development of a novel framework for human-machine interactions with a broad spectrum of applications, and b) the improvement of the quality of life of disabled people whose mobility has been re
Status: Active
Effective start/end date: 9/1/21 – 8/31/22

Funding

  • Rehabilitation Institute of Chicago (Subaward 9531//2054406)
  • National Science Foundation (Subaward 9531//2054406)
