Single color one-shot scan using topology information

Hiroshi Kawasaki, Hitoshi Masuyama, Ryusuke Sagawa, Ryo Furukawa

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)


In this paper, we propose a new technique to achieve one-shot scan using a single-color, static-pattern projector; such a method is ideal for the acquisition of a moving object. Since projector-camera systems generally suffer from uncertainty in retrieving correspondences between the captured image and the projected pattern, many solutions have been proposed. In particular, for one-shot scan, in which only a single image is used for reconstruction, the positional information of each pixel on the projected pattern must be encoded by spatial and/or color information. Although color information is frequently used for encoding, it is severely affected by the texture and material of the object. In this paper, we propose a technique that solves this problem by using topological information instead of colors. Our technique successfully realizes one-shot scan with a monochrome pattern.
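As background for the spatial-encoding idea mentioned in the abstract (this is not the paper's topology-based method, whose details the abstract does not give), a classic way to encode a pixel's position in a projected stripe pattern is a de Bruijn sequence: every window of n consecutive symbols occurs only once, so a decoder can recover a stripe's index from its local neighborhood alone. A minimal illustrative sketch, with the function names `de_bruijn` and `build_decoder` being assumptions of this example:

```python
def de_bruijn(k, n):
    """Return a de Bruijn sequence over alphabet {0..k-1}: every
    length-n window of the cyclic sequence occurs exactly once."""
    a = [0] * (k * n)
    seq = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return seq

def build_decoder(seq, n):
    """Map each length-n window to its starting index in the stripe code."""
    return {tuple(seq[i:i + n]): i for i in range(len(seq) - n + 1)}

# Binary (monochrome-friendly) stripe code: 16 stripes, decodable
# from any 4 consecutive stripes observed in the camera image.
code = de_bruijn(2, 4)
decoder = build_decoder(code, 4)
# Every 4-stripe window is unique, so a local observation fixes position.
assert len(decoder) == len(code) - 3
```

With a binary alphabet the pattern stays monochrome, which is the same constraint the paper targets; color-coded de Bruijn patterns gain shorter windows but inherit the texture/material sensitivity the abstract describes.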

Original language: English
Title of host publication: Computer Vision, ECCV 2012 - Workshops and Demonstrations, Proceedings
Publisher: Springer Verlag
Number of pages: 10
Edition: PART 3
ISBN (Print): 9783642338847
Publication status: Published - 2012
Externally published: Yes
Event: Computer Vision, ECCV 2012 - Workshops and Demonstrations - Florence, Italy
Duration: Oct 7, 2012 – Oct 13, 2012

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 3
Volume: 7585 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349


Conference: Computer Vision, ECCV 2012 - Workshops and Demonstrations

All Science Journal Classification (ASJC) codes

  • Theoretical Computer Science
  • General Computer Science
