Clickable augmented documents

Sandy Martedi, Hideaki Uchiyama, Hideo Saito

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

5 Citations (Scopus)

Abstract

This paper presents an Augmented Reality (AR) system for physical text documents that enables users to click on a document. In the system, we track the relative pose between a camera and a document in order to continuously overlay virtual content on the document. In addition, we compute the trajectory of a fingertip based on skin color detection for clicking interaction. By merging document tracking with an interaction technique, we have developed a novel tangible document system. As an application, we developed an AR dictionary system that overlays the meaning and explanation of a word when the user clicks it on the document. In the experiments, we present the accuracy of the clicking interaction and the robustness of our document tracking method against occlusion.
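The skin-color-based fingertip tracking and click detection described in the abstract could be sketched roughly as follows. This is a minimal, hypothetical Python/OpenCV illustration, not the authors' implementation: the color thresholds, the dwell-time click heuristic, and all function names are assumptions added here for clarity.

```python
import cv2
import numpy as np

# Assumed parameters; the paper does not state its exact thresholds.
SKIN_LOW = np.array([0, 133, 77], dtype=np.uint8)      # YCrCb lower bound
SKIN_HIGH = np.array([255, 173, 127], dtype=np.uint8)  # YCrCb upper bound
CLICK_RADIUS = 6    # px: max fingertip jitter while "dwelling"
CLICK_FRAMES = 15   # frames the fingertip must dwell to register a click

def fingertip(frame_bgr):
    """Return an (x, y) fingertip candidate from the skin mask, or None."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LOW, SKIN_HIGH)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    hand = max(contours, key=cv2.contourArea)
    if cv2.contourArea(hand) < 1000:   # ignore small skin-colored blobs
        return None
    # Take the topmost contour point as the fingertip
    # (assumes the hand enters the frame from the bottom).
    x, y = min(hand.reshape(-1, 2), key=lambda p: p[1])
    return int(x), int(y)

def run(camera_index=0):
    cap = cv2.VideoCapture(camera_index)
    trajectory = []                    # recent fingertip positions
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        tip = fingertip(frame)
        if tip is not None:
            trajectory.append(tip)
            trajectory = trajectory[-CLICK_FRAMES:]
            # Register a "click" when the fingertip dwells in one spot.
            if (len(trajectory) == CLICK_FRAMES and
                    max(np.hypot(x - tip[0], y - tip[1])
                        for x, y in trajectory) < CLICK_RADIUS):
                print("click at", tip)  # map to document coords, look up word
                trajectory.clear()
            cv2.circle(frame, tip, 5, (0, 0, 255), -1)
        cv2.imshow("clickable document", frame)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc quits
            break
    cap.release()
    cv2.destroyAllWindows()

if __name__ == "__main__":
    run()
```

In the actual system, the clicked image point would additionally be mapped into document coordinates using the tracked camera-to-document pose before the dictionary lookup; that tracking step is omitted from this sketch.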

Original language: English
Title of host publication: 2010 IEEE International Workshop on Multimedia Signal Processing, MMSP2010
Pages: 162-166
Number of pages: 5
DOIs
Publication status: Published - 2010
Externally published: Yes
Event: 2010 IEEE International Workshop on Multimedia Signal Processing, MMSP2010 - Saint Malo, France
Duration: Oct 4 2010 – Oct 6 2010

Publication series

Name: 2010 IEEE International Workshop on Multimedia Signal Processing, MMSP2010

Other

Other: 2010 IEEE International Workshop on Multimedia Signal Processing, MMSP2010
Country/Territory: France
City: Saint Malo
Period: 10/4/10 – 10/6/10

All Science Journal Classification (ASJC) codes

  • Computer Graphics and Computer-Aided Design
  • Human-Computer Interaction
  • Signal Processing
