TY - GEN
T1 - Multi-stage activity inference for locomotion and transportation analytics of mobile users
AU - Nakamura, Yugo
AU - Umetsu, Yoshinori
AU - Talusan, Jose Paolo
AU - Yasumoto, Keiichi
AU - Sasaki, Wataru
AU - Takata, Masashi
AU - Arakawa, Yutaka
N1 - Publisher Copyright:
© 2018 Association for Computing Machinery
PY - 2018/10/8
Y1 - 2018/10/8
N2 - In this paper, we, Ubi-NUTS Japan, introduce a multi-stage activity inference method that can recognize a user's mode of locomotion and transportation using mobile device sensors. We use the Sussex-Huawei Locomotion-Transportation (SHL) dataset to tackle the SHL recognition challenge, whose goal is to recognize 8 modes of locomotion and transportation (still, walk, run, bike, car, bus, train, and subway) from the inertial sensor data of a smartphone. We adopt a multi-stage approach in which the 8-class classification problem is divided into multiple sub-problems based on the similarity of the activities. Multimodal sensor data collected from a mobile phone are classified using a proposed pipeline that combines feature extraction with 4 different classifiers built using the random forest algorithm. We evaluated our method with 5-fold cross-validation on over 271 hours of daily-activity data from 1 participant. The evaluation results demonstrate that our method accurately recognizes the 8 types of activities with an average F1-score of 97%.
AB - In this paper, we, Ubi-NUTS Japan, introduce a multi-stage activity inference method that can recognize a user's mode of locomotion and transportation using mobile device sensors. We use the Sussex-Huawei Locomotion-Transportation (SHL) dataset to tackle the SHL recognition challenge, whose goal is to recognize 8 modes of locomotion and transportation (still, walk, run, bike, car, bus, train, and subway) from the inertial sensor data of a smartphone. We adopt a multi-stage approach in which the 8-class classification problem is divided into multiple sub-problems based on the similarity of the activities. Multimodal sensor data collected from a mobile phone are classified using a proposed pipeline that combines feature extraction with 4 different classifiers built using the random forest algorithm. We evaluated our method with 5-fold cross-validation on over 271 hours of daily-activity data from 1 participant. The evaluation results demonstrate that our method accurately recognizes the 8 types of activities with an average F1-score of 97%.
KW - Activity Recognition
KW - Mobile Computing
KW - Mode of Locomotion and Transportation Inference
UR - http://www.scopus.com/inward/record.url?scp=85058303103&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85058303103&partnerID=8YFLogxK
U2 - 10.1145/3267305.3267526
DO - 10.1145/3267305.3267526
M3 - Conference contribution
AN - SCOPUS:85058303103
T3 - UbiComp/ISWC 2018 - Adjunct Proceedings of the 2018 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2018 ACM International Symposium on Wearable Computers
SP - 1579
EP - 1588
BT - UbiComp/ISWC 2018 - Adjunct Proceedings of the 2018 ACM International Joint Conference on Pervasive and Ubiquitous Computing and Proceedings of the 2018 ACM International Symposium on Wearable Computers
PB - Association for Computing Machinery, Inc
T2 - 2018 Joint ACM International Conference on Pervasive and Ubiquitous Computing, UbiComp 2018 and 2018 ACM International Symposium on Wearable Computers, ISWC 2018
Y2 - 8 October 2018 through 12 October 2018
ER -