A Novel Algorithm for Vehicle Detection and Tracking in Airborne Videos

Mohamed A. Abdelwahab, Moataz M. Abdelwahab

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

14 Citations (Scopus)

Abstract

Real-time detection and tracking of multiple vehicles in airborne videos remains a challenging problem due to camera motion and low resolution. In this paper, a real-time technique for simultaneously detecting, tracking, and counting vehicles in both airborne and stationary-camera videos is proposed. First, feature points are extracted and tracked across video frames. A new strategy removes non-stationary background points by measuring how the histogram of the pixels around each feature point changes over time. The resulting foreground features are clustered and grouped into separately trackable vehicles based on their motion properties. Experimental results on videos from airborne and fixed cameras confirm the excellent properties of the proposed algorithm.
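The histogram-based foreground/background separation described above can be sketched in plain Python. This is a minimal illustration, not the authors' implementation: the patch size, the 16-bin histogram, the L1 distance, and the 0.5 threshold are all assumptions, and it assumes each tracked feature point already has a sequence of surrounding pixel patches collected over frames. Points whose local histogram stays stable over time are treated here as background; points whose local appearance changes are kept as foreground candidates.

```python
def local_histogram(gray_patch, bins=16):
    """Normalized intensity histogram of a patch of grayscale pixels (0-255)."""
    counts = [0] * bins
    for v in gray_patch:
        counts[min(v * bins // 256, bins - 1)] += 1
    total = len(gray_patch) or 1
    return [c / total for c in counts]

def histogram_change(h1, h2):
    """L1 distance between two normalized histograms (0 = identical, 2 = disjoint)."""
    return sum(abs(a - b) for a, b in zip(h1, h2))

def classify_features(patch_sequences, bins=16, threshold=0.5):
    """Label each feature point 'foreground' or 'background' from the
    frame-to-frame change of its local histogram.

    patch_sequences: one list per feature point, each containing that
    point's surrounding pixel patch in consecutive frames.
    threshold: assumed cutoff on histogram change (not from the paper).
    """
    labels = []
    for patches in patch_sequences:
        hists = [local_histogram(p, bins) for p in patches]
        max_change = max(
            (histogram_change(hists[i], hists[i + 1]) for i in range(len(hists) - 1)),
            default=0.0,
        )
        labels.append("foreground" if max_change > threshold else "background")
    return labels
```

A feature point sitting on a moving vehicle sees its surrounding pixels change as the vehicle passes over the background, so its histogram distance spikes; a well-tracked point on static scenery keeps a near-constant local histogram even while the airborne camera moves.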

Original language: English
Title of host publication: Proceedings - 2015 IEEE International Symposium on Multimedia, ISM 2015
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 65-68
Number of pages: 4
ISBN (Electronic): 9781509003792
Publication status: Published - Mar 25, 2016
Externally published: Yes
Event: 17th IEEE International Symposium on Multimedia, ISM 2015 - Miami, United States
Duration: Dec 14, 2015 - Dec 16, 2015

Publication series

Name: Proceedings - 2015 IEEE International Symposium on Multimedia, ISM 2015

Conference

Conference: 17th IEEE International Symposium on Multimedia, ISM 2015
Country/Territory: United States
City: Miami
Period: 12/14/15 - 12/16/15

All Science Journal Classification (ASJC) codes

  • Computer Science Applications
  • Hardware and Architecture
  • Software
  • Computer Networks and Communications
