Unsupervised Discovery of Normal and Abnormal Activity Patterns in Indoor and Outdoor Environments

Dario Dotti*, Mirela Popa, Stylianos Asteriadis

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference article in proceeding › Academic › peer-review

3 Citations (Web of Science)


In this paper we propose an adaptive system for monitoring indoor and outdoor environments using movement patterns. Our system is able to discover normal and abnormal activity patterns in the absence of any prior knowledge. We employ several feature descriptors, extracting both spatial and temporal cues from trajectories over a spatial grid. Moreover, we improve the initial feature vectors by applying sparse autoencoders, which yield optimized, compact representations and improved accuracy. Next, activity models are learnt in an unsupervised manner using clustering techniques. The experiments are performed on both indoor and outdoor datasets. The obtained results prove the suitability of the proposed system, achieving an accuracy of over 98% in classifying normal vs. abnormal activity patterns in both scenarios. Furthermore, a semantic interpretation of the most important regions of the scene is obtained without the need for human labels, highlighting the flexibility of our method.
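The pipeline the abstract describes (movement-direction histograms over a spatial grid, followed by unsupervised clustering into normal vs. abnormal patterns) can be sketched roughly as below. The grid size, direction binning, and plain k-means are illustrative assumptions rather than the authors' exact descriptors, and the sparse-autoencoder compression step is omitted for brevity:

```python
import numpy as np

def movement_histogram(trajectory, grid=(4, 4), n_bins=8, extent=1.0):
    """Quantize a trajectory's step directions into a per-grid-cell histogram.

    trajectory: (T, 2) array of (x, y) positions in [0, extent).
    Returns a normalized feature vector of length grid[0] * grid[1] * n_bins.
    """
    hist = np.zeros((grid[0], grid[1], n_bins))
    steps = np.diff(trajectory, axis=0)              # motion vector of each step
    angles = np.arctan2(steps[:, 1], steps[:, 0])    # step direction in (-pi, pi]
    bins = ((angles + np.pi) / (2 * np.pi) * n_bins).astype(int) % n_bins
    cells = np.clip((trajectory[:-1] / extent * np.array(grid)).astype(int),
                    0, np.array(grid) - 1)           # grid cell where each step starts
    for (cx, cy), b in zip(cells, bins):
        hist[cx, cy, b] += 1
    flat = hist.ravel()
    return flat / max(flat.sum(), 1.0)               # normalize to a distribution

def kmeans(X, k=2, iters=50, seed=0):
    """Minimal k-means standing in for the paper's clustering stage."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        labels = ((X[:, None] - centers[None]) ** 2).sum(-1).argmin(1)
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# Synthetic demo: "normal" trajectories walk left to right, while
# "abnormal" ones follow a random walk around the scene centre.
rng = np.random.default_rng(1)
normal = [np.c_[np.linspace(0.1, 0.9, 30),
                0.5 + 0.02 * rng.standard_normal(30)] for _ in range(10)]
abnormal = [np.clip(0.5 + np.cumsum(0.05 * rng.standard_normal((30, 2)), axis=0),
                    0.0, 0.999) for _ in range(10)]
features = np.array([movement_histogram(t) for t in normal + abnormal])
labels = kmeans(features, k=2)
```

In the paper's setting, the histogram features would additionally be passed through a sparse autoencoder before clustering, which compacts the representation and, per the reported results, improves accuracy.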
Original language: English
Title of host publication: VISAPP 2017 - 12th International Conference on Computer Vision Theory and Applications, Porto, Portugal, 27 February - 1 March 2017
Number of pages: 8
ISBN (Print): 9789897582264
Publication status: Published - 2017
Event: 12th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP) - Porto, Portugal
Duration: 27 Feb 2017 - 1 Mar 2017




Keywords:
  • Ambient Assisted Living
  • Video Surveillance
  • Unsupervised Learning
  • Movement Histograms
  • Scene Understanding
