Autonomous Visual Rendering using Physical Motion

Ahalya Prabhakar*, Anastasia Mavrommati, Jarvis Schultz, Todd D. Murphey

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

This paper addresses the problem of enabling a robot to represent and recreate visual information through physical motion, focusing on drawing using pens, brushes, or other tools. This work uses ergodicity as a control objective that translates planar visual input to physical motion without preprocessing (e.g., image processing, motion primitives). We achieve comparable results to existing drawing methods, while reducing the algorithmic complexity of the software. We demonstrate that optimal ergodic control algorithms with different time-horizon characteristics (infinitesimal, finite, and receding horizon) can generate qualitatively and stylistically different motions that render a wide range of visual information (e.g., letters, portraits, landscapes). In addition, we show that ergodic control enables the same software design to apply to multiple robotic systems by incorporating their particular dynamics, thereby reducing the dependence on task-specific robots. Finally, we demonstrate physical drawings with the Baxter robot.
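The core idea in the abstract is using ergodicity as the control objective: the robot's trajectory is driven so that its time-averaged statistics match the spatial distribution defined by the visual input. As a rough illustration of what is being optimized, the sketch below computes a spectral ergodic metric in the style of Mathew and Mezić's measure, comparing cosine-basis Fourier coefficients of a trajectory against those of a target image density on the unit square. This is a hypothetical minimal implementation for intuition, not the authors' algorithm; the basis truncation, grid size, and Sobolev weight exponent are all illustrative choices.

```python
import numpy as np

def ergodic_metric(trajectory, target, n_coeffs=8):
    """Spectral ergodic metric on the unit square [0, 1]^2.

    Compares Fourier (cosine-basis) coefficients of the trajectory's
    time-averaged statistics against those of a target density.

    trajectory : (T, 2) array of planar points in [0, 1]^2
    target     : (N, N) array, an unnormalized density on an N x N grid
    """
    N = target.shape[0]
    # Grid cell centers for integrating against the target density.
    xs = (np.arange(N) + 0.5) / N
    X, Y = np.meshgrid(xs, xs, indexing="ij")
    target = target / target.sum()  # normalize to a probability mass

    metric = 0.0
    for kx in range(n_coeffs):
        for ky in range(n_coeffs):
            # Cosine basis function on the grid and along the trajectory.
            basis_grid = np.cos(kx * np.pi * X) * np.cos(ky * np.pi * Y)
            basis_traj = (np.cos(kx * np.pi * trajectory[:, 0])
                          * np.cos(ky * np.pi * trajectory[:, 1]))
            phi_k = (target * basis_grid).sum()  # target coefficient
            c_k = basis_traj.mean()              # trajectory coefficient
            # Sobolev-type weight de-emphasizes high spatial frequencies.
            lam = (1.0 + kx**2 + ky**2) ** -1.5
            metric += lam * (c_k - phi_k) ** 2
    return metric
```

A trajectory that covers the domain in proportion to the target density drives this metric toward zero; a pen stuck in one spot leaves a large residual. An ergodic controller (infinitesimal, finite, or receding horizon, as in the paper) would choose controls to decrease such a metric over time.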

Original language: English (US)
Title of host publication: Springer Proceedings in Advanced Robotics
Publisher: Springer Science and Business Media B.V.
Pages: 80-95
Number of pages: 16
DOIs
State: Published - 2020

Publication series

Name: Springer Proceedings in Advanced Robotics
Volume: 13
ISSN (Print): 2511-1256
ISSN (Electronic): 2511-1264

Keywords

  • Automation
  • Motion control
  • Robot art

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Electrical and Electronic Engineering
  • Mechanical Engineering
  • Engineering (miscellaneous)
  • Artificial Intelligence
  • Computer Science Applications
  • Applied Mathematics
