Motion generation by learning relationship between object shapes and human motions

Tokuo Tsuji, Sho Tajima, Yosuke Suzuki, Tetsuyou Watanabe, Shoko Miyauchi, Kenichi Morooka, Kensuke Harada, Hiroaki Seki

Research output: Contribution to journal › Conference article › peer-review

Abstract

This paper presents a method for planning robot motions for daily tasks by learning the relationship between object shapes and human motions. Robots are required to handle a wide variety of objects across many categories, yet it is difficult for them to plan motions automatically because even objects within the same category differ in shape. In our method, motions are estimated by learning the relationship between object shapes and human motions using linear regression analysis.
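
The abstract names linear regression as the mapping from object shape to motion but does not give the feature or motion representations. The sketch below is a minimal illustration of that idea, not the authors' implementation: it assumes hypothetical shape descriptors X (e.g., object dimensions) and demonstration-derived motion parameters Y (e.g., flattened trajectory keypoints), and fits an ordinary least-squares linear map in NumPy.

    import numpy as np

    # Hypothetical shape descriptors for 50 training objects
    # (e.g., length, width, height, aspect ratio); placeholders only.
    X = np.random.rand(50, 4)

    # Hypothetical motion parameters from human demonstrations,
    # one row per object (e.g., flattened trajectory keypoints).
    Y = np.random.rand(50, 12)

    # Fit a linear map Y ~= X_aug @ W by ordinary least squares.
    X_aug = np.hstack([X, np.ones((X.shape[0], 1))])  # append bias term
    W, *_ = np.linalg.lstsq(X_aug, Y, rcond=None)

    # Estimate motion parameters for a new object from its shape descriptor.
    x_new = np.array([0.3, 0.1, 0.2, 1.5])
    y_pred = np.append(x_new, 1.0) @ W

In practice the predicted parameters would be decoded back into a robot trajectory; that step depends on the motion representation, which the abstract does not specify.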

Original language: English
Pages (from-to): 332-335
Number of pages: 4
Journal: Proceedings of International Conference on Artificial Life and Robotics
Volume: 2021
Publication status: Published - 2021
Event: 26th International Conference on Artificial Life and Robotics, ICAROB 2021 - Beppu, Oita, Japan
Duration: Jan 21 2021 – Jan 24 2021

All Science Journal Classification (ASJC) codes

  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Hardware and Architecture
  • Information Systems
  • Control and Systems Engineering
  • Electrical and Electronic Engineering
  • Modelling and Simulation
