Occlusion-aware 3D multiple object tracking for visual surveillance


Thesis Type: Doctoral

Institution: Orta Doğu Teknik Üniversitesi (Middle East Technical University), Faculty of Engineering, Department of Electrical and Electronics Engineering, Turkey

Approval Date: 2013

Student: OSMAN TOPÇU

Supervisor: ABDULLAH AYDIN ALATAN

Abstract:

This thesis presents an occlusion-aware particle filter framework for online tracking of multiple people, using observations from multiple cameras with overlapping fields of view, for surveillance applications. The surveillance problem involves inferring the motives of people from their actions, which are in turn deduced from their trajectories. Visual tracking is required to obtain these trajectories, and it is a challenging problem due to motion model variations, size and illumination changes, and especially occlusions between moving objects. At the expense of an increased number of cameras, tracking in 3D world coordinates is preferred over its 2D counterpart because of decreased viewpoint dependency and better occlusion handling through depth ordering of the targets.

The contributions of this novel algorithm are: (1) the observation error is increased in accordance with the estimated occlusion probability, so that the uncertainty in object location during an occlusion is taken into consideration; (2) the increased observation error causes the particles to spread, which in turn widens the association gate, so that the location uncertainty during occlusion is covered by the gate and the chance of losing the object is reduced; (3) the observation error is allowed to change with the dimensions of the associated silhouette, so that distant and close objects can be tracked without parameter adjustment, while the particles are also saved from degeneracy when silhouettes merge during occlusion. The improved performance of the proposed tracker is demonstrated in comparison with other state-of-the-art trackers on the PETS 2009, EPFL, and PETS 2006 datasets.

Furthermore, the proposed algorithm is extended to surveillance systems whose cameras are not synchronized in time. The time difference between cameras is estimated by a Gaussian mixture kernel density estimator and compensated by the particle filter trackers. The experiments indicate that the proposed algorithm is able to work when the time difference between cameras is within one second.

Finally, the position and velocity states of the proposed algorithm are divided into nonlinear and linear parts in a Rao-Blackwellized fashion. In this formulation, the velocity is marginalized out by a Kalman filter, while the object position is filtered by a particle filter. It is shown that the resulting algorithm performs competitively without performance degradation when the number of particles is reduced.
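The occlusion-aware weighting described in contributions (1)-(3) can be illustrated with a minimal sketch. This is not the thesis' implementation: the function name `occlusion_aware_weights` and the parameters `base_sigma`, `ref_size`, and `occlusion_gain` are hypothetical, and a simple ground-plane Gaussian likelihood is assumed. The sketch shows how inflating the observation standard deviation with the estimated occlusion probability, and scaling it with silhouette size, spreads the particles and thereby widens the effective association gate.

```python
import numpy as np

def occlusion_aware_weights(particles, measurement, base_sigma,
                            occlusion_prob, silhouette_size,
                            ref_size=100.0, occlusion_gain=3.0):
    """Particle weight update with occlusion-inflated observation noise.

    particles       : (N, 2) ground-plane positions of the particles
    measurement     : (2,) observed ground-plane position of the target
    base_sigma      : nominal observation std. deviation
    occlusion_prob  : estimated probability in [0, 1] that the target is occluded
    silhouette_size : size of the associated silhouette (e.g. bbox height, px)
    """
    # Scale the noise with silhouette size so near (large) and far (small)
    # objects are handled without retuning, and inflate it with the
    # occlusion probability; a larger sigma spreads the particles and
    # effectively widens the association gate during occlusions.
    sigma = base_sigma * (silhouette_size / ref_size)
    sigma *= 1.0 + occlusion_gain * occlusion_prob

    d2 = np.sum((particles - measurement) ** 2, axis=1)
    w = np.exp(-0.5 * d2 / sigma ** 2)
    return w / (w.sum() + 1e-300)   # normalized importance weights
```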
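The time-offset estimation for unsynchronized cameras can likewise be sketched as a Gaussian-mixture kernel density estimate over per-observation offset hypotheses, whose mode is taken as the inter-camera offset. This assumes the hypotheses have already been collected (for example, by aligning a target's trajectories seen from two cameras); the name `estimate_time_offset` and the bandwidth and grid parameters are illustrative rather than the thesis' settings.

```python
import numpy as np

def estimate_time_offset(offset_samples, bandwidth=0.05,
                         search_range=(-1.0, 1.0), grid_points=2001):
    """Estimate the inter-camera time offset as the mode of a Gaussian
    kernel density built from candidate offsets (in seconds)."""
    grid = np.linspace(*search_range, grid_points)
    # One Gaussian kernel per sample, summed: a Gaussian-mixture KDE.
    density = np.exp(-0.5 * ((grid[:, None] - offset_samples[None, :])
                             / bandwidth) ** 2).sum(axis=1)
    return grid[np.argmax(density)]   # mode of the density = estimated offset

# Example: noisy hypotheses clustered near 0.2 s give an estimate near 0.2.
print(estimate_time_offset(np.array([0.18, 0.21, 0.20, 0.55, 0.19])))
```

Taking the mode rather than the mean makes the estimate robust to outlier hypotheses from mismatched observations.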
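The Rao-Blackwellized split can be sketched for a 1-D constant-velocity model: the position is sampled by particles, while the velocity, which is linear-Gaussian conditioned on the sampled positions, is marginalized by a per-particle Kalman filter. This is a generic textbook-style sketch under an assumed model (p[k+1] = p[k] + T*v[k] + noise, z[k] = p[k] + noise), not the thesis' exact formulation; all names and noise parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbpf_step(pos, vel_mean, vel_var, weights, z, T=0.1,
              q_pos=0.01, q_vel=0.05, r_obs=0.25):
    """One Rao-Blackwellized particle filter step (1-D).

    pos      : (N,) sampled particle positions
    vel_mean : (N,) per-particle Kalman means of the velocity
    vel_var  : (N,) per-particle Kalman variances of the velocity
    weights  : (N,) importance weights (uniform at initialization)
    z        : scalar position measurement
    """
    # Propagate positions with the marginal predictive of each particle.
    innov_var = T**2 * vel_var + q_pos
    new_pos = pos + T * vel_mean + np.sqrt(innov_var) * rng.standard_normal(pos.shape)

    # Kalman update of velocity from the sampled displacement
    # d = new_pos - pos = T*v + w_p, a linear pseudo-measurement of v.
    d = new_pos - pos
    gain = T * vel_var / innov_var
    vel_mean = vel_mean + gain * (d - T * vel_mean)
    vel_var = vel_var - gain * T * vel_var
    vel_var = vel_var + q_vel            # velocity time update

    # Importance weights from the position measurement z.
    w = weights * np.exp(-0.5 * (z - new_pos) ** 2 / r_obs)
    w /= w.sum() + 1e-300

    # Systematic resampling when the effective sample size drops.
    if 1.0 / np.sum(w**2) < 0.5 * len(w):
        u = (rng.random() + np.arange(len(w))) / len(w)
        idx = np.minimum(np.searchsorted(np.cumsum(w), u), len(w) - 1)
        new_pos, vel_mean, vel_var = new_pos[idx], vel_mean[idx], vel_var[idx]
        w = np.full(len(w), 1.0 / len(w))
    return new_pos, vel_mean, vel_var, w
```

Because only the position is sampled, the particle set covers a lower-dimensional space, which is consistent with the abstract's observation that fewer particles suffice without performance degradation.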