NRI: Autonomous Synthesis of Haptic Languages

Haptic systems use both kinesthetic and tactile information to construct contact-based representations of the environment. Compact means of processing and storing that sensed data in formats usable for object identification and manipulation planning are still lacking. Moreover, experimental systems typically rely on databases of pre-recorded information and use pattern-matching techniques to compare sensed data, in its original form, against libraries of previously encountered signals. The goal of the proposed work is to identify means by which an autonomous system can construct its own compact representation of tactile data for purposes of estimation (e.g., object identification) and control (e.g., manipulation planning).

The proposed work will develop computational synthesis tools for mechanical systems that experience contact and sense the world through touch. The focus will be on exploration for language development, where a language is technically defined by the symbols of its alphabet and the words that alphabet can produce; the expressivity of the language is then encoded in its entropy. A key hypothesis of the proposed work is that an autonomous mechanical system can discover a language, in this case one appropriate for haptic perception, by balancing energy, ergodicity, and entropy. A secondary but important hypothesis is that the resulting planning policy can be computed with reasonable computational complexity. The proposed work will ultimately include autonomous construction of the symbols of the alphabet directly from analog data, detection of the allowable words through ergodic search, and reduction of the language to a minimal or near-minimal language with the same computed entropy. The work will have application both to autonomous haptic sensing for robots and to the design of advanced haptic interfaces for people.
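To make the notion of "expressivity encoded in entropy" concrete, the sketch below computes the Shannon entropy of the symbol distribution over a set of words built from a haptic alphabet. The alphabet symbols (`press`, `slide`, `tap`) and the word set are hypothetical placeholders, not part of the proposed work; this is only an illustration of the standard entropy calculation the abstract refers to.

```python
from collections import Counter
from math import log2

def language_entropy(words):
    """Shannon entropy (bits per symbol) of the symbol distribution
    induced by a collection of words over a haptic alphabet."""
    counts = Counter(symbol for word in words for symbol in word)
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Hypothetical haptic "words" over a 3-symbol alphabet {press, slide, tap}.
words = [("press", "slide"), ("tap", "press"), ("slide", "slide", "tap")]
H = language_entropy(words)
# H is bounded above by log2(3), the entropy of a uniform 3-symbol alphabet.
```

A language that uses its symbols more uniformly has higher entropy and, in this sense, greater expressivity per symbol.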
With the advent of surface haptic technologies that enable multi-digit tactile interactions, there is a need for principled approaches to the development of expressive haptic languages. Through human trials, we will investigate the tradeoffs between the entropy and redundancy of a language, leading to a language synthesis process that is optimized with respect to richness of expression while remaining robust in terms of recognition. A state-of-the-art robot hand and arm will support studies of autonomous synthesis in this research.

The broader impacts of this work will include outreach, technology transfer to rehabilitation, the development of online courses in dynamics and analysis, and impact on technology for the visually impaired. The PIs are currently working with the Museum of Science and Industry, and as part of the proposed work the PIs and supported graduate students will participate in a National Robotics Week exhibit in the main rotunda of the museum, with an estimated viewership of over one million visitors. Graduate students, undergraduates, and high school students involved in the PIs' laboratories will all be involved in the exhibit. The algorithms and software that are part of the proposed work are also in use in ongoing research at the Rehabilitation Institute of Chicago. Outcomes of this work will have immediate impact on those projects as well as on haptic interface technology relevant to the visually impaired. Lastly, both PIs are involved in significant classroom innovations, and the proposed work will include complementary online courses.
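The entropy-redundancy tradeoff mentioned above can be quantified with Shannon's standard definition of redundancy: the fraction of the alphabet's maximum entropy that a language leaves unused. This is a generic illustration of the metric, not the project's own formulation; the example values are hypothetical.

```python
from math import log2

def redundancy(entropy_bits, alphabet_size):
    """Shannon redundancy R = 1 - H / H_max, where H_max = log2(|alphabet|).
    R near 0 means the language is maximally expressive; R near 1 means
    it is highly redundant, which typically makes recognition more robust."""
    return 1.0 - entropy_bits / log2(alphabet_size)

# A hypothetical 4-symbol haptic alphabet used at 1.0 bit/symbol:
# half of the available 2 bits/symbol is spent on redundancy.
R = redundancy(1.0, 4)  # 0.5
```

Human trials can then probe where along this axis (low R, rich expression vs. high R, reliable recognition) a usable haptic language should sit.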
Effective start/end date: 8/1/14 → 7/31/19
- National Science Foundation (IIS-1426961)