Lost and found: Identifying objects in long-term surveillance videos

Citation

Saemi, Mohamad Mahdi and See, John Su Yang and Tan, Suyin (2016) Lost and found: Identifying objects in long-term surveillance videos. In: 2015 IEEE International Conference on Signal and Image Processing Applications (ICSIPA). IEEE, pp. 99-104. ISBN 978-1-4799-8996-6

Abstract

What good are surveillance videos without knowing what objects are in them? Object classification has been actively researched for images and, more recently, for videos, but not in the long-term sense. Videos that span a long period of time pose arduous challenges for such a task. This paper intends to bridge that gap by exploring object classification in long-term surveillance videos. In this work, we introduce a complete framework for processing long-term surveillance videos with the aim of classifying moving objects into five distinct classes commonly found in these scenes. After effective extraction of moving objects and track creation, object features are encoded in a bag-of-words model before classification is performed. Extensive experiments were conducted on a selected portion of the recent LOST dataset. With state-of-the-art PHOW features, we achieve the highest accuracy of around 92% using a track-based classification scheme that is robust against potential frame-level misclassifications.
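The abstract describes a track-based scheme that absorbs frame-level misclassifications but does not spell out the mechanism. A minimal Python sketch of one plausible realisation, a majority vote over the per-frame predictions within each track, is given below; this is an assumption for illustration, not the authors' code, and the class names are hypothetical placeholders since the abstract does not list the five classes.

    # Minimal sketch (assumed, not from the paper): assign a single label to a
    # track by majority vote over its frame-level bag-of-words predictions.
    from collections import Counter

    # Hypothetical class names; the paper only states there are five classes.
    CLASSES = ["person", "car", "bicycle", "animal", "other"]

    def classify_track(frame_labels):
        """Return the most frequent frame-level label in the track.

        Voting over all frames suppresses occasional frame-level
        misclassifications, which is the robustness property the abstract
        attributes to its track-based scheme.
        """
        if not frame_labels:
            raise ValueError("track has no classified frames")
        label, _ = Counter(frame_labels).most_common(1)[0]
        return label

    # Example: a track misclassified in a few frames still resolves correctly.
    track = ["person"] * 17 + ["bicycle"] * 3
    print(classify_track(track))  # -> "person"

Under this reading, per-frame errors only flip the track label if they dominate the track, which matches the claim that the scheme is robust against potential frame-level misclassifications.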

Item Type: Book Section
Uncontrolled Keywords: Feature extraction, Surveillance, Videos, Histograms, Distortion, Object detection, Tracking
Subjects: T Technology > TA Engineering (General). Civil engineering (General) > TA1501-1820 Applied optics. Photonics
Divisions: Faculty of Computing and Informatics (FCI)
Depositing User: Ms Rosnani Abd Wahab
Date Deposited: 08 Dec 2017 15:19
Last Modified: 08 Dec 2017 15:19
URI: http://shdl.mmu.edu.my/id/eprint/6592
