Global localization for a mobile robot using laser reflectance and particle filter

Dong Xiang Zhang, Ryo Kurazume

Research output: Contribution to journal › Article › peer-review


Global localization is a fundamental requirement for a mobile robot. Map-based global localization is a popular technique that gives a precise position by comparing a provided geometric map with current sensory data. However, it is quite time-consuming when 3D range data is processed for 6D global localization. On the other hand, appearance-based global localization, which matches a captured image against recorded images, is simple and suitable for real-time processing. However, this technique does not work in the dark or in environments where the lighting conditions change markedly. To cope with these problems, we have proposed a two-step strategy that combines map-based and appearance-based global localization. First, several candidate positions are selected by an appearance-based technique, and the optimal position is then determined by a map-based technique. Instead of camera images, we use reflectance images, which are captured by a laser range finder as a by-product of range sensing. In this paper, a new global localization technique is proposed by combining this two-step algorithm with a sampling-based approach. To cope with odometry uncertainty, a particle filter is adopted for tracking robot positions. The effectiveness of the proposed technique is demonstrated through experiments in real environments.
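The particle-filter tracking step mentioned in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; it assumes a 2D pose (x, y, theta), a simplified 1D range measurement, and a hypothetical `range_from_map` function standing in for the map-based comparison. The three stages are the standard predict / weight / resample cycle.

```python
import math
import random

def predict(particles, odom, noise=0.1):
    """Propagate each particle by the odometry increment plus Gaussian noise,
    modeling odometry uncertainty."""
    dx, dy, dtheta = odom
    return [(x + dx + random.gauss(0, noise),
             y + dy + random.gauss(0, noise),
             (theta + dtheta + random.gauss(0, noise)) % (2 * math.pi))
            for (x, y, theta) in particles]

def weight(particles, measured_range, range_from_map, sigma=0.5):
    """Score each particle by how well the range predicted from the map at that
    pose matches the sensor reading; return normalized weights."""
    w = [math.exp(-(measured_range - range_from_map(p)) ** 2 / (2 * sigma ** 2))
         for p in particles]
    total = sum(w) or 1.0
    return [wi / total for wi in w]

def resample(particles, weights):
    """Draw a new particle set proportional to the weights (importance resampling)."""
    return random.choices(particles, weights=weights, k=len(particles))
```

In the paper's setting, the weighting step would compare reflectance and range data against the map rather than a single scalar range; the cycle structure is the same.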

Original language: English
Pages (from-to): 9-16
Number of pages: 8
Journal: Research Reports on Information Science and Electrical Engineering of Kyushu University
Issue number: 1
Publication status: Published - May 2012
Externally published: Yes

All Science Journal Classification (ASJC) codes

  • Electrical and Electronic Engineering
  • Computer Science(all)
