TR2012-044

Changedetection.net: A New Change Detection Benchmark Dataset


    •  Goyette, N.; Jodoin, P.-M.; Porikli, F.; Konrad, J.; Ishwar, P., "Changedetection.net: A New Change Detection Benchmark Dataset", IEEE Conference on Computer Vision and Pattern Recognition (CVPR), DOI: 10.1109/CVPRW.2012.6238919, June 2012, pp. 1-8.
      @inproceedings{Goyette2012jun,
        author = {Goyette, N. and Jodoin, P.-M. and Porikli, F. and Konrad, J. and Ishwar, P.},
        title = {Changedetection.net: A New Change Detection Benchmark Dataset},
        booktitle = {IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
        year = 2012,
        pages = {1--8},
        month = jun,
        doi = {10.1109/CVPRW.2012.6238919},
        url = {http://www.merl.com/publications/TR2012-044}
      }
  • Research Area: Computer Vision


Change detection is one of the most commonly encountered low-level tasks in computer vision and video processing. A plethora of algorithms have been developed to date, yet no widely accepted, realistic, large-scale video dataset exists for benchmarking different methods. Presented here is a unique change detection benchmark dataset consisting of nearly 90,000 frames in 31 video sequences representing 6 categories selected to cover a wide range of challenges in 2 modalities (color and thermal IR). A distinguishing characteristic of this dataset is that each frame is meticulously annotated for ground-truth foreground, background, and shadow area boundaries, an effort that goes well beyond a simple binary label denoting the presence of change. This enables objective and precise quantitative comparison and ranking of change detection algorithms. This paper presents and discusses various aspects of the new dataset, the quantitative performance metrics used, and comparative results for over a dozen previous and new change detection algorithms. The dataset, evaluation tools, and algorithm rankings are available to the public on a website (changedetection.net) and will be updated based on feedback from academia and industry.
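
As a rough illustration of the kind of per-frame, pixel-level evaluation such ground truth enables, the sketch below computes common change detection metrics (recall, precision, specificity, F-measure, and percentage of wrong classifications) by comparing a binary result mask against a binary ground-truth mask. This is a minimal sketch, not the dataset's official evaluation code: it assumes both masks are simple binary images with nonzero pixels marking change, and it ignores shadow and unknown labels that the actual annotations distinguish.

# Minimal sketch of pixel-level change detection scoring (assumed binary masks,
# not the official CDnet evaluation tool).
import numpy as np

def change_detection_metrics(result: np.ndarray, ground_truth: np.ndarray) -> dict:
    """Compare two equal-shape masks; nonzero = foreground (change)."""
    res = result.astype(bool)
    gt = ground_truth.astype(bool)

    tp = np.count_nonzero(res & gt)    # change correctly detected
    fp = np.count_nonzero(res & ~gt)   # background flagged as change
    fn = np.count_nonzero(~res & gt)   # change missed
    tn = np.count_nonzero(~res & ~gt)  # background correctly ignored

    eps = 1e-12  # guard against division by zero on empty masks
    recall = tp / (tp + fn + eps)
    precision = tp / (tp + fp + eps)
    specificity = tn / (tn + fp + eps)
    f_measure = 2 * precision * recall / (precision + recall + eps)
    pwc = 100.0 * (fn + fp) / (tp + fn + fp + tn + eps)  # % wrong classifications

    return {"recall": recall, "precision": precision, "specificity": specificity,
            "f_measure": f_measure, "pwc": pwc}

if __name__ == "__main__":
    # Synthetic example: a ground-truth object and an imperfect detection of it.
    gt = np.zeros((240, 320), dtype=np.uint8)
    gt[100:150, 120:200] = 255
    result = np.zeros_like(gt)
    result[105:150, 125:210] = 255
    print(change_detection_metrics(result, gt))

In practice, such counts would be accumulated over all annotated frames of a sequence (and over sequences within a category) before the ratios are computed, so that per-frame scores on frames with little or no change do not dominate the result.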