TY - GEN
T1 - Kinect-based 3D Human Motion Acquisition and Evaluation System for Remote Rehabilitation and Exercise
AU - You, Yu
AU - Wang, Tai Qi
AU - Osawa, Keisuke
AU - Shimodozono, Megumi
AU - Tanaka, Eiichiro
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - With the increasingly serious aging of society, more and more elderly people are living with physical disabilities. In addition, rehabilitation resources are in short supply and unevenly distributed, and the impact of COVID-19 since late 2019 has greatly restricted many patients from going to rehabilitation centers for training. To solve these problems, we propose a "Kinect-based 3D Human Motion Acquisition and Evaluation System for Remote Rehabilitation and Exercise," which uses the Kinect3 camera to capture human motion with an error rate of only 3% when the body faces the camera. We then use Unity to create a humanoid virtual model and interactive scenes and synchronize the real body motion to the virtual model with an average error of less than 1%. The system also provides reliable and highly accurate methods for evaluating actions based on angles and trajectories. Moreover, users do not need to wear any devices: it is a markerless motion acquisition system, which reduces cost and improves the usability and scalability of the system. The interactive virtual scenes also increase users' training motivation.
AB - With the increasingly serious aging of society, more and more elderly people are living with physical disabilities. In addition, rehabilitation resources are in short supply and unevenly distributed, and the impact of COVID-19 since late 2019 has greatly restricted many patients from going to rehabilitation centers for training. To solve these problems, we propose a "Kinect-based 3D Human Motion Acquisition and Evaluation System for Remote Rehabilitation and Exercise," which uses the Kinect3 camera to capture human motion with an error rate of only 3% when the body faces the camera. We then use Unity to create a humanoid virtual model and interactive scenes and synchronize the real body motion to the virtual model with an average error of less than 1%. The system also provides reliable and highly accurate methods for evaluating actions based on angles and trajectories. Moreover, users do not need to wear any devices: it is a markerless motion acquisition system, which reduces cost and improves the usability and scalability of the system. The interactive virtual scenes also increase users' training motivation.
UR - http://www.scopus.com/inward/record.url?scp=85137709130&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85137709130&partnerID=8YFLogxK
U2 - 10.1109/AIM52237.2022.9863318
DO - 10.1109/AIM52237.2022.9863318
M3 - Conference contribution
AN - SCOPUS:85137709130
T3 - IEEE/ASME International Conference on Advanced Intelligent Mechatronics, AIM
SP - 1213
EP - 1218
BT - 2022 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, AIM 2022
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2022 IEEE/ASME International Conference on Advanced Intelligent Mechatronics, AIM 2022
Y2 - 11 July 2022 through 15 July 2022
ER -