TY - GEN
T1 - Texture-less planar object detection and pose estimation using Depth-Assisted Rectification of Contours
AU - Lima, João Paulo
AU - Uchiyama, Hideaki
AU - Teichrieb, Veronica
AU - Marchand, Eric
PY - 2012
Y1 - 2012
N2 - This paper presents a method named Depth-Assisted Rectification of Contours (DARC) for detection and pose estimation of texture-less planar objects using RGB-D cameras. It consists of matching contours extracted from the current image to previously acquired template contours. In order to achieve invariance to rotation, scale and perspective distortions, a rectified representation of the contours is obtained using the available depth information. DARC requires only a single RGB-D image of the planar objects in order to estimate their pose, as opposed to some existing approaches that need to capture a number of views of the target object. It also does not require generating warped versions of the templates, which is commonly needed by existing object detection techniques. It is shown that the DARC method runs in real time and that its detection and pose estimation quality is suitable for augmented reality applications.
AB - This paper presents a method named Depth-Assisted Rectification of Contours (DARC) for detection and pose estimation of texture-less planar objects using RGB-D cameras. It consists of matching contours extracted from the current image to previously acquired template contours. In order to achieve invariance to rotation, scale and perspective distortions, a rectified representation of the contours is obtained using the available depth information. DARC requires only a single RGB-D image of the planar objects in order to estimate their pose, as opposed to some existing approaches that need to capture a number of views of the target object. It also does not require generating warped versions of the templates, which is commonly needed by existing object detection techniques. It is shown that the DARC method runs in real time and that its detection and pose estimation quality is suitable for augmented reality applications.
UR - http://www.scopus.com/inward/record.url?scp=84873529334&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84873529334&partnerID=8YFLogxK
U2 - 10.1109/ISMAR.2012.6402582
DO - 10.1109/ISMAR.2012.6402582
M3 - Conference contribution
AN - SCOPUS:84873529334
SN - 9781467346603
T3 - ISMAR 2012 - 11th IEEE International Symposium on Mixed and Augmented Reality 2012, Science and Technology Papers
SP - 297
EP - 298
BT - ISMAR 2012 - 11th IEEE International Symposium on Mixed and Augmented Reality 2012, Science and Technology Papers
T2 - 11th IEEE and ACM International Symposium on Mixed and Augmented Reality, ISMAR 2012
Y2 - 5 November 2012 through 8 November 2012
ER -