Project: Research project

Project Details


The proposed work will develop methods for robots to actively collect data to build up general-purpose machine learning models of sensor-object relationships. Using neural networks and autoencoders as the machine learning setting, we will synthesize motion that maximizes the learning utility of each measurement, enabling an agent to both generalize and improve models for future use in search and identification. The goal is to enable an autonomous system to use sensors with unknown physics to identify objects of unknown geometry and composition in challenging environments under data collection time constraints. Moreover, these techniques will enable an autonomous system to actively maintain its learned representations as operational conditions change.

This work aims to enable robots to quickly collect data under severe time constraints in unforgiving environments, improving learning models originally created in benign environments with unlimited time and compute. It will generate active learning algorithms that automate data collection to improve classification of previously unknown objects using arbitrary sensors in environments with unspecified dynamics. For instance, electrosense sensors can use electromagnetic fields to detect and identify objects under water; but for an object of unknown geometry and material properties, in water that may have complex fluid dynamics and nonhomogeneous composition, the sensor-object relationship will generally be unknown and not amenable to first-principles analysis. This work will enable both these novel sensing modalities and traditional sensors (such as vision), as well as their fusion, while capitalizing on model-based control of the autonomous agent that moves the sensor through the environment.
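The core idea of synthesizing measurements that "maximize the learning utility of each measurement" can be illustrated with a minimal active-learning sketch. The example below is an assumption-laden toy, not the project's method: it stands in for a neural-network ensemble with bootstrapped polynomial fits, uses ensemble disagreement as the utility score, and greedily selects the next measurement location where the current models disagree most. The function names (`sensor_reading`, `fit_ensemble`, `utility`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D "sensor-object" response, unknown to the learner.
def sensor_reading(x):
    return np.sin(3.0 * x) + 0.05 * rng.normal()

def fit_ensemble(xs, ys, n_models=8, degree=3):
    # Bootstrapped polynomial fits stand in for a neural-network ensemble.
    models = []
    for _ in range(n_models):
        idx = rng.integers(0, len(xs), len(xs))  # bootstrap resample
        models.append(np.polyfit(xs[idx], ys[idx], degree))
    return models

def utility(models, candidates):
    # Predictive disagreement (variance across the ensemble) scores how
    # informative a measurement at each candidate location would be.
    preds = np.array([np.polyval(m, candidates) for m in models])
    return preds.var(axis=0)

# Seed with a few measurements, then greedily collect the most useful ones.
xs = rng.uniform(-1.0, 1.0, 5)
ys = np.array([sensor_reading(x) for x in xs])
candidates = np.linspace(-1.0, 1.0, 200)

for _ in range(10):
    models = fit_ensemble(xs, ys)
    x_next = candidates[np.argmax(utility(models, candidates))]
    xs = np.append(xs, x_next)
    ys = np.append(ys, sensor_reading(x_next))

print(len(xs))  # 5 seed points + 10 actively chosen measurements = 15
```

In the proposed work, the analogue of `x_next` would be a sensor trajectory produced by model-based control rather than a single query point, and the utility would come from the learned neural representation rather than a fixed ensemble.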
Effective start/end date: 7/19/21 – 7/18/24


  • Office of Naval Research (N00014-21-1-2706 P00002)

