Toward augmenting everything: Detecting and tracking geometrical features on planar objects

Hideaki Uchiyama, Eric Marchand

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

14 Citations (Scopus)

Abstract

This paper presents an approach for detecting and tracking various types of planar objects with geometrical features. We combine traditional keypoint detectors with Locally Likely Arrangement Hashing (LLAH) [21] for geometrical feature based keypoint matching. Because the stability of keypoint extraction affects the accuracy of keypoint matching, we set criteria for keypoint selection based on keypoint response and the distance between keypoints. To provide robustness to scale changes, we build a non-uniform image pyramid according to the keypoint distribution at each scale. In the experiments, we evaluate the applicability of traditional keypoint detectors with LLAH for detection. We also compare our approach with SURF and finally demonstrate that it is possible to detect and track different types of textures, including colorful pictures, binary fiducial markers, and handwriting.
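The abstract's keypoint-selection criteria (keypoint response and inter-keypoint distance) can be illustrated with a short sketch. This is not the paper's exact algorithm; the function name, thresholds, and greedy suppression strategy below are assumptions chosen to show the general idea that stable, well-separated keypoints make locally hashed arrangements more repeatable.

```python
import math

def select_keypoints(keypoints, min_response=0.01, min_dist=10.0):
    """Illustrative greedy selection (an assumption, not the paper's
    exact criteria): keep only keypoints whose detector response is
    strong enough and that lie at least `min_dist` pixels from every
    keypoint already kept.

    keypoints: list of (x, y, response) tuples.
    """
    # Consider strongest responses first so weaker nearby points
    # are suppressed in favor of more stable ones.
    candidates = sorted(
        (kp for kp in keypoints if kp[2] >= min_response),
        key=lambda kp: kp[2], reverse=True)
    selected = []
    for x, y, r in candidates:
        if all(math.hypot(x - sx, y - sy) >= min_dist
               for sx, sy, _ in selected):
            selected.append((x, y, r))
    return selected

# Toy example: two tight clusters plus one very weak point.
pts = [(0, 0, 0.9), (3, 4, 0.8), (50, 50, 0.5), (52, 50, 0.7), (5, 5, 0.005)]
print(select_keypoints(pts))  # one survivor per cluster, weak point dropped
```

In this toy run, the weak point is rejected by the response threshold, and within each spatial cluster only the strongest keypoint survives the distance check.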

Original language: English
Title of host publication: 2011 10th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2011
Pages: 17-25
Number of pages: 9
DOIs
Publication status: Published - 2011
Externally published: Yes
Event: 2011 10th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2011 - Basel, Switzerland
Duration: Oct 26 2011 - Oct 29 2011

Publication series

Name: 2011 10th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2011

Other

Other: 2011 10th IEEE International Symposium on Mixed and Augmented Reality, ISMAR 2011
Country/Territory: Switzerland
City: Basel
Period: 10/26/11 - 10/29/11

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
