TY - GEN
T1 - First-person animal activity recognition from egocentric videos
AU - Iwashita, Yumi
AU - Takamine, Asamichi
AU - Kurazume, Ryo
AU - Ryoo, M. S.
N1 - Publisher Copyright:
© 2014 IEEE.
PY - 2014/12/4
Y1 - 2014/12/4
N2 - This paper introduces the concept of first-person animal activity recognition, the problem of recognizing activities from the viewpoint of an animal (e.g., a dog). Similar to first-person activity recognition scenarios where humans wear cameras, our approach estimates activities performed by an animal wearing a camera. This enables monitoring and understanding of natural animal behaviors even when there are no people around them. Its applications include automated logging of animal behaviors for medical/biology experiments, monitoring of pets, and investigation of wildlife patterns. In this paper, we construct a new dataset composed of first-person animal videos obtained by mounting a camera on each of four pet dogs. Our new dataset consists of 10 activities containing a heavy/fair amount of ego-motion. We implemented multiple baseline approaches to recognize activities from such videos while utilizing multiple types of global/local motion features. Animal ego-actions as well as human-animal interactions are recognized with the baseline approaches, and we discuss experimental results.
AB - This paper introduces the concept of first-person animal activity recognition, the problem of recognizing activities from the viewpoint of an animal (e.g., a dog). Similar to first-person activity recognition scenarios where humans wear cameras, our approach estimates activities performed by an animal wearing a camera. This enables monitoring and understanding of natural animal behaviors even when there are no people around them. Its applications include automated logging of animal behaviors for medical/biology experiments, monitoring of pets, and investigation of wildlife patterns. In this paper, we construct a new dataset composed of first-person animal videos obtained by mounting a camera on each of four pet dogs. Our new dataset consists of 10 activities containing a heavy/fair amount of ego-motion. We implemented multiple baseline approaches to recognize activities from such videos while utilizing multiple types of global/local motion features. Animal ego-actions as well as human-animal interactions are recognized with the baseline approaches, and we discuss experimental results.
UR - http://www.scopus.com/inward/record.url?scp=84919904410&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84919904410&partnerID=8YFLogxK
U2 - 10.1109/ICPR.2014.739
DO - 10.1109/ICPR.2014.739
M3 - Conference contribution
AN - SCOPUS:84919904410
T3 - Proceedings - International Conference on Pattern Recognition
SP - 4310
EP - 4315
BT - Proceedings - International Conference on Pattern Recognition
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 22nd International Conference on Pattern Recognition, ICPR 2014
Y2 - 24 August 2014 through 28 August 2014
ER -