NRI: Electrosense imaging for underwater telepresence and manipulation

Project: Research project

Project Details

Description

Overview:
NRI: Electrosense imaging for underwater telepresence and manipulation; Michael Peshkin; Northwestern University

Human telepresence and telemanipulation in unstructured underwater environments is essential for tasks such as security sweeps in harbors and oil field servicing.
Co-robotic solutions are needed, as the risks are great for human divers and autonomous robots
lack the capability to deal with unpredictable contingencies. A key challenge for underwater
human telepresence is providing the human with situation awareness, both for navigation and
manipulation. Two popular sensory modalities for ROVs are vision (short-range sensing) and
sonar (long-range sensing). Vision fails, however, in murky environments, such as when mud is kicked up from the bottom. A short-range alternative to visual line-of-sight is needed.
The solution proposed here is inspired by biological electrosense. Electrosense is used by
the weakly electric fish to navigate and hunt in murky water where vision is ineffective.
These fish generate an AC electric field that is perturbed by objects nearby. Electroreceptors
covering the body of the fish report the amplitude and phase of the local field. The animal
decodes electric field perturbations into geometric information about its surroundings. Electrosense is fundamentally different from optical vision and other imaging modalities, which create projective images of 3D space. Electrosense data therefore demands new preprocessing methods for human interpretation and new computational methods for machine interpretation. A Northwestern University team has been
studying electrosense in fish for several years, modeling it computationally, and pioneering
electronic equivalents. A parallel effort at the University of Washington, also inspired by
weakly electric fish, has focused on electrosense for manipulation. These two academic teams
will join with industrial collaborator HDT Robotics to make possible a new human experience
of underwater environments, and new machine learning capabilities supporting navigation and
manipulation.
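The sensing principle described above can be sketched with the classical small-sphere model of an electric image: an object in a locally uniform AC field produces a dipole-like perturbation whose sign and strength depend on its conductivity contrast with the surrounding water. A minimal Python sketch, with all geometry and conductivity values chosen purely for illustration (they are not the project's parameters):

```python
import numpy as np

def sphere_perturbation(sensors, center, radius, sigma_obj, sigma_water, E0):
    """Perturbation of the potential measured at each sensor due to a small
    sphere in a locally uniform field E0 (classical small-sphere model):
        dphi = chi * a^3 * (E0 . r) / |r|^3,
    with conductivity contrast chi = (s_obj - s_w) / (s_obj + 2*s_w)."""
    chi = (sigma_obj - sigma_water) / (sigma_obj + 2.0 * sigma_water)
    r = sensors - center                      # vectors from sphere to sensors
    dist = np.linalg.norm(r, axis=1)
    return chi * radius**3 * (r @ E0) / dist**3

# Hypothetical 1-D strip of 11 sensors, field along x, sphere 10 cm above it.
sensors = np.stack([np.linspace(-0.5, 0.5, 11),
                    np.zeros(11), np.zeros(11)], axis=1)
dphi = sphere_perturbation(sensors,
                           center=np.array([0.0, 0.0, 0.1]),
                           radius=0.02,
                           sigma_obj=1e7,          # conductive object
                           sigma_water=0.5,
                           E0=np.array([1.0, 0.0, 0.0]))
```

A conductive object (contrast near +1) and an insulating one (contrast near -0.5) produce perturbations of opposite sign, which is one of the cues available for decoding geometry from the electric image.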
Intellectual Merit:
The proposed efforts are organized into three thrusts: (1) electrosense imaging for operator
situation awareness in vehicle-scale positioning and navigation, (2) electrosense pretouch
to extract object shape and position at the manipulator scale during underwater grasping,
and (3) appropriate experimental testbeds. For Thrust 1, the team will develop a new instrument:
a kilopixel electrosense array. Electric images are not readily interpretable by humans, so
one goal is to develop data processing methods to facilitate human visual interpretation.
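As one illustration of such preprocessing (purely hypothetical, not the project's method), a frame of per-pixel amplitude and phase readings can be mapped to color, with phase driving hue and normalized amplitude driving brightness, so that perturbations of different sign and strength become visually distinct:

```python
import colorsys

import numpy as np

def electric_image_to_rgb(amplitude, phase):
    """Map per-pixel AC measurements to an RGB image:
    phase -> hue, normalized amplitude -> brightness (illustrative only)."""
    amp = np.abs(amplitude) / (np.abs(amplitude).max() + 1e-12)
    hue = (phase % (2 * np.pi)) / (2 * np.pi)
    rgb = np.empty(amplitude.shape + (3,))
    for idx in np.ndindex(amplitude.shape):
        rgb[idx] = colorsys.hsv_to_rgb(hue[idx], 1.0, amp[idx])
    return rgb

# Hypothetical 32x32 tile of a kilopixel frame: random amplitude and phase.
rng = np.random.default_rng(0)
frame = electric_image_to_rgb(rng.random((32, 32)),
                              rng.uniform(0.0, 2 * np.pi, size=(32, 32)))
```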
The team will also develop algorithms to extract specific features relevant to navigation
and odometry. For Thrust 2, the team will develop new methods that use electrosense for planning and control of grasping in water. In contrast to the electrosense array, the electrosense sensors
used as part of a grasper are sparse and non-coplanar. The team will employ optimal information
gathering and machine learning to gain relevant 3D geometric information during approach-to-grasp.
The industrial/academic collaboration provides a number of testbeds for human telepresence,
including a 4-axis gantry robot in a large water tank; a 10-DoF waterproof arm/hand robot
for grasping and manipulation; and a commercial underwater vehicle for reality-testing.
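The "optimal information gathering" idea can be illustrated with a toy one-dimensional example (a generic Bayesian expected-entropy sketch, not the project's algorithm): maintain a belief over candidate object positions and pick the next sensing location that minimizes the expected posterior entropy of that belief.

```python
import numpy as np

def predicted_reading(sensor_x, obj_x):
    """Toy measurement model: a perturbation that falls off with distance."""
    return 1.0 / (1.0 + (sensor_x - obj_x) ** 2)

def expected_entropy(prior, obj_grid, sensor_x, noise=0.05, n_samples=200):
    """Monte-Carlo estimate of the expected posterior entropy (nats) of the
    object-position belief after one measurement taken at sensor_x."""
    rng = np.random.default_rng(0)
    mu = predicted_reading(sensor_x, obj_grid)  # predicted reading per hypothesis
    total = 0.0
    for _ in range(n_samples):
        true_i = rng.choice(len(obj_grid), p=prior)   # simulate a world...
        y = mu[true_i] + rng.normal(0.0, noise)       # ...and its noisy reading
        likelihood = np.exp(-0.5 * ((y - mu) / noise) ** 2)
        post = prior * likelihood
        post /= post.sum()
        total += -(post * np.log(post + 1e-12)).sum()
    return total / n_samples

obj_grid = np.linspace(-1.0, 1.0, 41)             # candidate object positions
prior = np.ones_like(obj_grid) / obj_grid.size    # uniform initial belief
candidate_sensors = np.linspace(-1.0, 1.0, 9)     # places we could sense next
best = min(candidate_sensors,
           key=lambda s: expected_entropy(prior, obj_grid, s))
```

In a real system the measurement model would come from an electrosense forward model and the belief would cover object pose and shape, but the greedy choose-the-most-informative-measurement loop has the same structure.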
Broader Impacts:
A sensory modality other than optical vision is needed to improve situation awareness and
make grasping and manipulation tasks possible for operators of ROVs.
Status: Finished
Effective start/end date: 9/1/14 – 8/31/19

Funding

  • National Science Foundation (IIS-1427419)
