TY - JOUR
T1 - Real-time detection of the interaction between an upper-limb power-assist robot user and another person for perception-assist
AU - Chathuramali, K. G. Manosha
AU - Kiguchi, Kazuo
N1 - Publisher Copyright:
© 2020 Elsevier B.V.
PY - 2020/6
Y1 - 2020/6
N2 - Assisting the aged population or people with disabilities is a critical problem in today's world. To compensate for their declined motor function, power-assist wearable robots have been proposed. In some cases, however, the cognitive function of these populations may have declined as well, and it may be insufficient to compensate only for their motor-function deficiency. Perception-assist wearable robots, which perceive environmental information using visual sensors attached to them, have been proposed to address this problem. This study addresses the problem of identifying the motion intentions of the user of an upper-limb power-assist wearable robot while the user engages in desired interactions with others. It is important to consider both interacting parties in order to accurately predict the proper interaction. Therefore, this paper presents an interaction recognition methodology that combines the user's motion intention, the other party's motion intention, and environmental information. A fuzzy reasoning model is proposed to semantically combine the motion intentions of both parties with environmental information. In this method, the motion intentions of the user and the other party are simultaneously estimated using kinematic information and visual information, respectively, and they are employed to predict the interactions between the two parties. The effectiveness of the proposed approach is experimentally evaluated.
AB - Assisting the aged population or people with disabilities is a critical problem in today's world. To compensate for their declined motor function, power-assist wearable robots have been proposed. In some cases, however, the cognitive function of these populations may have declined as well, and it may be insufficient to compensate only for their motor-function deficiency. Perception-assist wearable robots, which perceive environmental information using visual sensors attached to them, have been proposed to address this problem. This study addresses the problem of identifying the motion intentions of the user of an upper-limb power-assist wearable robot while the user engages in desired interactions with others. It is important to consider both interacting parties in order to accurately predict the proper interaction. Therefore, this paper presents an interaction recognition methodology that combines the user's motion intention, the other party's motion intention, and environmental information. A fuzzy reasoning model is proposed to semantically combine the motion intentions of both parties with environmental information. In this method, the motion intentions of the user and the other party are simultaneously estimated using kinematic information and visual information, respectively, and they are employed to predict the interactions between the two parties. The effectiveness of the proposed approach is experimentally evaluated.
UR - http://www.scopus.com/inward/record.url?scp=85079047399&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85079047399&partnerID=8YFLogxK
U2 - 10.1016/j.cogsys.2020.01.002
DO - 10.1016/j.cogsys.2020.01.002
M3 - Article
AN - SCOPUS:85079047399
SN - 2214-4366
VL - 61
SP - 53
EP - 63
JO - Cognitive Systems Research
JF - Cognitive Systems Research
ER -