Flying triangulation - A motion-robust optical 3D sensor for the real-time shape acquisition of complex objects

Florian Willomitzer, Svenja Ettl, Oliver Arold, Gerd Häusler

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

6 Scopus citations

Abstract

The three-dimensional shape acquisition of objects has become increasingly important in recent years. Several well-established methods already yield impressive results. However, even under quite common conditions, such as object movement or a complex shape, most methods deliver unsatisfactory results. Thus, 3D shape acquisition remains a difficult and non-trivial task. We present our measurement principle "Flying Triangulation", which enables motion-robust 3D acquisition of complex-shaped object surfaces with a freely movable handheld sensor. Since "Flying Triangulation" is scalable, a whole sensor zoo for different object sizes is presented. Finally, an overview of current and future fields of investigation is given.
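The abstract names triangulation as the underlying measurement principle. As background, the sketch below illustrates the standard single-spot triangulation sensitivity relation Δz = Δx / (m · sin θ), where a lateral spot displacement Δx on the detector encodes a height change Δz; m is the imaging magnification and θ the triangulation angle. The function and its parameters are illustrative textbook material, not the paper's specific sensor model.

```python
import math

def triangulation_height(dx_detector, magnification, theta_deg):
    """Height change inferred from a lateral spot shift on the detector.

    Classic triangulation sensitivity relation (illustrative only,
    not the exact model of the Flying Triangulation sensor):
        delta_z = delta_x / (m * sin(theta))
    where theta is the angle between projection and observation
    directions and m is the imaging magnification.
    """
    return dx_detector / (magnification * math.sin(math.radians(theta_deg)))

# Example: a 10 um spot shift at magnification 0.5 and a 30 deg angle
# corresponds to a height change of about 40 um.
dz = triangulation_height(10e-6, 0.5, 30.0)
print(f"{dz * 1e6:.1f} um")
```

The relation also shows the usual design trade-off: a larger triangulation angle θ increases height sensitivity but worsens shadowing on complex-shaped objects, which is one reason handheld, multi-view acquisition is attractive.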

Original language: English (US)
Title of host publication: 3rd International Topical Meeting on Optical Sensing and Artificial Vision, OSAV 2012
Pages: 19-26
Number of pages: 8
DOIs
State: Published - 2013
Event: 3rd International Topical Meeting on Optical Sensing and Artificial Vision, OSAV 2012 - Saint Petersburg, Russian Federation
Duration: May 14 2012 - May 17 2012

Publication series

Name: AIP Conference Proceedings
Volume: 1537
ISSN (Print): 0094-243X
ISSN (Electronic): 1551-7616

Conference

Conference: 3rd International Topical Meeting on Optical Sensing and Artificial Vision, OSAV 2012
Country: Russian Federation
City: Saint Petersburg
Period: 5/14/12 - 5/17/12

Keywords

  • 3D Metrology
  • Hand Guide
  • Optical Sensor
  • Real Time
  • Triangulation

ASJC Scopus subject areas

  • Physics and Astronomy (all)

