TY - JOUR
T1 - Distributed proprioception of 3D configuration in soft, sensorized robots via deep learning
AU - Truby, Ryan L.
AU - Della Santina, Cosimo
AU - Rus, Daniela
N1 - Funding Information:
Manuscript received September 10, 2019; accepted January 28, 2020. Date of publication February 26, 2020; date of current version March 9, 2020. This letter was recommended for publication by Associate Editor M. Cianchetti and Editor Kyu-Jin Cho upon evaluation of the reviewers’ comments. This work was supported by the NSF EFRI Program Grant 1830901. The work of R. L. Truby was supported by the Schmidt Science Fellows program, in partnership with the Rhodes Trust. (Ryan L. Truby and Cosimo Della Santina contributed equally to this work.) (Corresponding author: Ryan L. Truby.) The authors are with the MIT Computer Science and Artificial Intelligence Laboratory, Massachusetts Institute of Technology, Cambridge, MA 02139 USA (e-mail: rltruby@mit.edu; cosimodellasantina@gmail.com; rus@csail.mit.edu).
Publisher Copyright:
© 2016 IEEE.
PY - 2020/4
Y1 - 2020/4
AB - Creating soft robots with sophisticated, autonomous capabilities requires these systems to possess reliable, on-line proprioception of 3D configuration through integrated soft sensors. We present a framework for predicting a soft robot's 3D configuration via deep learning using feedback from a soft, proprioceptive sensor skin. Our framework introduces a kirigami-enabled strategy for rapidly sensorizing soft robots using off-the-shelf materials, a general kinematic description for soft robot geometry, and an investigation of neural network designs for predicting soft robot configuration. Even with hysteretic, non-monotonic feedback from the piezoresistive sensors, recurrent neural networks show potential for predicting our new kinematic parameters and, thus, the robot's configuration. One trained neural network closely predicts steady-state configuration during operation, though complete dynamic behavior is not fully captured. We validate our methods on a trunk-like arm with 12 discrete actuators and 12 proprioceptive sensors. As an essential advance in soft robotic perception, we anticipate our framework will open new avenues towards closed-loop control in soft robotics.
KW - Modeling, control, and learning for soft robots
KW - deep learning in robotics and automation
KW - soft sensors and actuators
UR - http://www.scopus.com/inward/record.url?scp=85081735197&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85081735197&partnerID=8YFLogxK
U2 - 10.1109/LRA.2020.2976320
DO - 10.1109/LRA.2020.2976320
M3 - Article
AN - SCOPUS:85081735197
SN - 2377-3766
VL - 5
SP - 3299
EP - 3306
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 2
M1 - 9013033
ER -