Project Details
Description
Overview: In many critical scenarios of object manipulation, such as writing or surgery, one must concurrently control and sense position and force. This is important for perception, such as when assessing the stiffness of an object, and for action, such as when manipulating objects and controlling the grip forces that help prevent their slippage. There are two kinds of force sensing modalities in our body: kinesthetic and tactile. For example, when cutting with a scalpel, kinesthetic force information is sensed via the tendons, and tactile information is sensed via the skin at the interface with the scalpel. In robotics, separation strategies are often used to simplify control. We hypothesize that, in contrast to robotics, humans integrate kinesthetic and tactile information for perception, action, and learning, and that they use optimal estimators such as a Bayesian mixture model in this integration. In this project, we will develop and experimentally validate models of kinesthetic and tactile information integration in (I) perception, (II) control of manipulation and grip forces, and (III) motor learning. To achieve this goal, we will use computational modeling and recently developed programmable devices for tactile stimulation to selectively perturb the tactile and kinesthetic information channels, breaking the natural coupling and congruence between them in healthy individuals and stroke survivors. We assert that understanding these processes will impact three domains: (1) Neuroscience: understanding our ability to gracefully manipulate objects with our hands; (2) Technology: presenting force information to users of robotic devices, such as in teleoperation, robot-assisted surgery, and surgical simulation; (3) Biomedicine and neurorehabilitation: developing intelligent controllers for robotic prostheses and rehabilitating dexterous manipulation skills following stroke and other neuromotor disorders.
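The hypothesized integration rule, an optimal estimator over the two force channels, can be made concrete with the textbook special case of reliability-weighted cue fusion. The sketch below illustrates that general idea only; it is not the project's actual model, and the function name and all numerical values are assumptions for demonstration.

```python
import numpy as np

def fuse_cues(x_kin, var_kin, x_tac, var_tac):
    """Maximum-likelihood fusion of two independent Gaussian cues.

    Each estimate is weighted by its inverse variance (its
    reliability); the fused variance is lower than either input's,
    which is the classic signature of optimal cue integration.
    """
    w_kin, w_tac = 1.0 / var_kin, 1.0 / var_tac
    x_hat = (w_kin * x_kin + w_tac * x_tac) / (w_kin + w_tac)
    var_hat = 1.0 / (w_kin + w_tac)
    return x_hat, var_hat

# Illustrative stiffness estimates (arbitrary units): a precise
# kinesthetic cue and a noisier tactile (skin-stretch) cue.
x_hat, var_hat = fuse_cues(x_kin=2.0, var_kin=0.1, x_tac=2.6, var_tac=0.4)
print(x_hat, var_hat)  # 2.12 0.08 -- pulled toward the more reliable cue
```

A full Bayesian mixture model would presumably add a weighting for whether the two cues share a common cause, which is exactly what breaking their congruency is designed to probe.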
Intellectual merit: The main objective of this project is to develop an experimentally validated model of the integration of kinesthetic (position and force) and tactile (skin-stretch) information in object manipulation and tool-mediated interactions. In Aim 1, we will study the combination of kinesthetic and tactile information in perception, manipulation, and grip force control by injecting noise, delay, and rotation between the different sensory channels to break their congruency. In Aim 2, we will focus on the adaptation of manipulation and grip forces in response to deterministic and random perturbations in the tactile and kinesthetic channels. To understand how this integration depends on intact brain structures, we will perform our investigation in healthy participants and in chronic stroke survivors. This study is expected to advance our understanding of perception, action, and cognition by elucidating how force information is processed to form adaptive internal representations of the world. The findings of this project will be applicable to developing bioinspired intelligent controllers for robotic hands, and to providing users with the sense of touch in physical human-robot interactions such as robot-assisted surgery, assistive devices, and neuroprosthetics. In addition, we will advance the understanding of how sensorimotor processing is impaired by stroke and other disabling conditions. In the long term, the results of this study may drastically improve the quality of life of patients undergoing surgery, stroke survivors treated with robotic rehabilitation devices, and individuals using robotic prostheses.
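For intuition about Aim 1's channel manipulations, the sketch below shows one way a commanded tactile (skin-stretch) trajectory could be decoupled from the kinesthetic channel by injecting delay, rotation, and noise. Everything here (the function, its parameters, and their values) is a hypothetical illustration, not the experimental protocol.

```python
import numpy as np

def perturb_tactile(skin_stretch, dt, delay_s=0.05, rotation_deg=30.0,
                    noise_std=0.02, seed=0):
    """Hypothetical decoupling of the tactile channel from the
    kinesthetic one: delay, rotate (in the 2-D skin plane), and add
    noise to a commanded skin-stretch trajectory of shape (N, 2)."""
    rng = np.random.default_rng(seed)
    # Delay: shift the trajectory by a whole number of samples,
    # holding the first sample during the dead time.
    shift = int(round(delay_s / dt))
    delayed = np.vstack([np.repeat(skin_stretch[:1], shift, axis=0),
                         skin_stretch[:len(skin_stretch) - shift]])
    # Rotation: misalign the tactile stretch direction relative to
    # the kinesthetic force direction.
    th = np.deg2rad(rotation_deg)
    rot = np.array([[np.cos(th), -np.sin(th)],
                    [np.sin(th),  np.cos(th)]])
    rotated = delayed @ rot.T
    # Noise: degrade the reliability of the tactile channel.
    return rotated + rng.normal(0.0, noise_std, size=rotated.shape)

# Usage: a straight-line stretch command, sampled at 200 Hz.
traj = np.column_stack([np.linspace(0.0, 1.0, 200), np.zeros(200)])
perturbed = perturb_tactile(traj, dt=0.005)
```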
Broader Impacts: The broader imp
| Status | Finished |
| --- | --- |
| Effective start/end date | 9/15/16 → 8/31/20 |
Funding
- Rehabilitation Institute of Chicago (81464/CL5296 // 1632259)
- National Science Foundation (81464/CL5296 // 1632259)