First-person animal activity recognition from egocentric videos

Yumi Iwashita, Asamichi Takamine, Ryo Kurazume, M. S. Ryoo

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

55 Citations (Scopus)


This paper introduces the concept of first-person animal activity recognition: the problem of recognizing activities from the viewpoint of an animal (e.g., a dog). Similar to first-person activity recognition scenarios in which humans wear cameras, our approach estimates activities performed by an animal wearing a camera. This enables monitoring and understanding of natural animal behaviors even when no people are present. Applications include automated logging of animal behaviors for medical/biology experiments, monitoring of pets, and investigation of wildlife patterns. In this paper, we construct a new dataset of first-person animal videos obtained by mounting a camera on each of four pet dogs. The dataset covers 10 activities containing a heavy/fair amount of ego-motion. We implemented multiple baseline approaches that recognize activities from such videos using several types of global and local motion features. Both animal ego-actions and human-animal interactions are recognized with the baseline approaches, and we discuss the experimental results.
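To make the idea of a "global motion feature" from an egocentric clip concrete, here is a minimal sketch (not the authors' code) that summarizes a clip's ego-motion as a histogram of per-pixel temporal intensity differences. The paper's baselines use richer global/local motion features; the function name and the NumPy-only approach here are illustrative assumptions.

```python
import numpy as np

def global_motion_feature(frames, n_bins=8):
    """Crude global-motion descriptor for a video clip: a normalized
    histogram of absolute per-pixel intensity differences between
    consecutive frames. Illustrative stand-in for the global motion
    features evaluated in the paper; not the authors' implementation."""
    diffs = [
        np.abs(curr.astype(np.float32) - prev.astype(np.float32))
        for prev, curr in zip(frames[:-1], frames[1:])
    ]
    hist, _ = np.histogram(np.stack(diffs), bins=n_bins,
                           range=(0, 255), density=True)
    return hist  # fixed-length descriptor; could feed a clip classifier

# Synthetic example: a clip with heavy ego-motion vs. a static clip.
rng = np.random.default_rng(0)
moving = [rng.integers(0, 256, (32, 32), dtype=np.uint8) for _ in range(5)]
static = [np.full((32, 32), 128, dtype=np.uint8)] * 5
f_move = global_motion_feature(moving)
f_stat = global_motion_feature(static)
```

A static clip puts all its mass in the lowest-difference bin, while a clip with heavy ego-motion spreads mass across bins, so such descriptors can separate high- and low-ego-motion activities.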

Original language: English
Title of host publication: Proceedings - International Conference on Pattern Recognition
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 6
ISBN (Electronic): 9781479952083
Publication status: Published - Dec 4, 2014
Event: 22nd International Conference on Pattern Recognition, ICPR 2014 - Stockholm, Sweden
Duration: Aug 24, 2014 to Aug 28, 2014

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (Print): 1051-4651


Other: 22nd International Conference on Pattern Recognition, ICPR 2014

All Science Journal Classification (ASJC) codes

  • Computer Vision and Pattern Recognition

