Gait-based person identification using 3D LiDAR and long short-term memory deep networks

Hiroyuki Yamada, Jeongho Ahn, Oscar Martinez Mozos, Yumi Iwashita, Ryo Kurazume

Research output: Contribution to journal › Article › peer-review

13 Citations (Scopus)


Gait recognition is a form of biometric identification, alongside facial, fingerprint, and retina recognition. Whereas most biometric methods require direct contact between a device and a subject, gait recognition has the unique characteristic that it requires no interaction with the subject and can be performed from a distance. Cameras are commonly used for gait recognition, and a number of researchers have used depth information obtained with an RGB-D camera such as the Microsoft Kinect. Although depth-based gait recognition has advantages, such as robustness to lighting conditions and appearance variations, it also has limitations: RGB-D cameras cannot be used outdoors, and their measurement distance is limited to approximately 10 meters. The present paper describes a long short-term memory (LSTM)-based method for gait recognition using a real-time multi-line LiDAR. Very few studies have dealt with LiDAR-based gait recognition, and the present study is the first to combine LiDAR data with long short-term memory for gait recognition while focusing on different appearances. We collect the first gait recognition dataset consisting of time-series range data for 30 people with clothing variations, and we show the effectiveness of the proposed approach.
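The pipeline the abstract describes, an LSTM consuming a time series of LiDAR range frames and classifying the final state into one of 30 identities, can be sketched as follows. This is a minimal numpy illustration, not the authors' implementation: the frame size (a hypothetical 16-line scanner with 64 horizontal bins), sequence length, and random weights are all assumptions for demonstration.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class LSTMCell:
    """Minimal LSTM cell (forward pass only), for illustration."""
    def __init__(self, input_dim, hidden_dim, seed=0):
        rng = np.random.default_rng(seed)
        # One stacked weight matrix for the four gates
        # (input, forget, candidate, output).
        self.W = rng.normal(0.0, 0.1, (4 * hidden_dim, input_dim + hidden_dim))
        self.b = np.zeros(4 * hidden_dim)
        self.hidden_dim = hidden_dim

    def step(self, x, h, c):
        z = self.W @ np.concatenate([x, h]) + self.b
        H = self.hidden_dim
        i = sigmoid(z[:H])           # input gate
        f = sigmoid(z[H:2 * H])      # forget gate
        g = np.tanh(z[2 * H:3 * H])  # candidate cell state
        o = sigmoid(z[3 * H:])       # output gate
        c = f * c + i * g
        h = o * np.tanh(c)
        return h, c

def classify_gait_sequence(frames, cell, W_out, b_out):
    """Run the LSTM over a sequence of flattened range frames and
    softmax-classify the final hidden state into person identities."""
    h = np.zeros(cell.hidden_dim)
    c = np.zeros(cell.hidden_dim)
    for x in frames:
        h, c = cell.step(x, h, c)
    logits = W_out @ h + b_out
    e = np.exp(logits - logits.max())
    return e / e.sum()

# Hypothetical sizes: 16 scan lines x 64 horizontal bins per frame,
# 30 subjects as in the paper's dataset.
input_dim, hidden_dim, n_people = 16 * 64, 128, 30
cell = LSTMCell(input_dim, hidden_dim)
rng = np.random.default_rng(1)
W_out = rng.normal(0.0, 0.1, (n_people, hidden_dim))
b_out = np.zeros(n_people)
frames = rng.normal(size=(40, input_dim))  # a 40-frame gait sequence
probs = classify_gait_sequence(frames, cell, W_out, b_out)
```

In practice the cell weights and output layer would be trained on labeled gait sequences; the sketch only shows the shape of the inference path from range frames to a 30-way identity distribution.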

Original language: English
Pages (from-to): 1-11
Number of pages: 11
Journal: Advanced Robotics
Publication status: Published - 2020

All Science Journal Classification (ASJC) codes

  • Software
  • Control and Systems Engineering
  • Human-Computer Interaction
  • Hardware and Architecture
  • Computer Science Applications

