Online Least Squares One-Class Support Vector Machines-Based Abnormal Visual Event Detection

Bibliographic Details
Published in: Sensors (Basel, Switzerland), Vol. 13, No. 12, pp. 17130-17155
Main Authors: Wang, Tian; Chen, Jie; Zhou, Yi; Snoussi, Hichem
Format: Journal Article
Language: English
Published: Switzerland: MDPI AG (Molecular Diversity Preservation International), 12.12.2013
ISSN: 1424-8220
DOI: 10.3390/s131217130

Summary: The abnormal event detection problem is an important subject in real-time video surveillance. In this paper, we propose a novel online one-class classification algorithm, the online least squares one-class support vector machine (online LS-OC-SVM), combined with its sparsified version (sparse online LS-OC-SVM). LS-OC-SVM extracts a hyperplane as an optimal description of the training objects in a regularized least squares sense. The online LS-OC-SVM first learns a basic normal model from a limited number of training samples and then updates the model with the remaining data. In the sparse online scheme, the model complexity is controlled by the coherence criterion. The online LS-OC-SVM is adopted to handle the abnormal event detection problem: each frame of the video is characterized by a covariance matrix descriptor encoding the motion information and is then classified as a normal or an abnormal frame. Experiments on a two-dimensional synthetic distribution dataset and a benchmark video surveillance dataset demonstrate the promising results of the proposed online LS-OC-SVM method.
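
The pipeline described in the summary (a covariance descriptor per frame, a one-class model trained on normal frames only) can be illustrated with a minimal sketch. The snippet below is a hypothetical stand-in, not the authors' implementation: scikit-learn's batch OneClassSVM replaces the online LS-OC-SVM (the recursive least-squares update and the coherence-based sparsification are not reproduced), the per-pixel motion features are synthetic random numbers, and the covariance matrix is vectorized by its upper triangle for simplicity.

import numpy as np
from sklearn.svm import OneClassSVM

def covariance_descriptor(features):
    # Covariance matrix of per-pixel feature vectors (n_pixels x d),
    # flattened to its upper triangle so a generic kernel machine can use it.
    cov = np.cov(features, rowvar=False)
    return cov[np.triu_indices_from(cov)]

# Hypothetical data: each "frame" is a set of per-pixel motion features
# (the real descriptor would be computed from the video's motion information).
rng = np.random.default_rng(0)
normal_frames = [rng.normal(0.0, 1.0, size=(500, 5)) for _ in range(200)]
test_frames = ([rng.normal(0.0, 1.0, size=(500, 5)) for _ in range(20)]
               + [rng.normal(3.0, 2.0, size=(500, 5)) for _ in range(5)])  # "abnormal"

X_train = np.array([covariance_descriptor(f) for f in normal_frames])
X_test = np.array([covariance_descriptor(f) for f in test_frames])

# One-class model trained on normal frames only; -1 marks abnormal frames.
clf = OneClassSVM(kernel="rbf", gamma="scale", nu=0.1).fit(X_train)
print(clf.predict(X_test))

In this sketch, frames for which predict() returns -1 are flagged as abnormal; the paper's method instead updates the normal model online as new frames arrive.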