Fusing 2D and 3D Clues for 3D Tracking Using Visual and Range Data


Gedik O. S., ALATAN A. A.

16th International Conference on Information Fusion (FUSION), İstanbul, Türkiye, 9 - 12 July 2013, pp. 1966-1973

  • Publication Type: Conference Paper / Full-Text Paper
  • City of Publication: İstanbul
  • Country of Publication: Türkiye
  • Page Numbers: pp. 1966-1973
  • Middle East Technical University Affiliated: Yes

Abstract

3D tracking of rigid objects is required in many applications, such as robotics and augmented reality (AR). The availability of accurate pose estimates increases reliability in robotic applications and decreases jitter in AR scenarios. Pure vision-based 3D trackers require either manual initialization or offline training stages, whereas trackers relying purely on depth sensors are not suitable for AR applications. In this paper, an automated 3D tracking algorithm is proposed that fuses vision and depth sensors via an Extended Kalman Filter (EKF) incorporating a novel observation weighting method. Moreover, novel feature selection and tracking schemes based on the intensity and shape index map (SIM) data of the 3D point cloud increase 2D and 3D tracking performance significantly. The proposed method requires neither manual initialization of the pose nor offline training, while enabling highly accurate 3D tracking. The accuracy of the proposed method is tested against a number of conventional techniques, and superior performance is observed.
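The abstract does not spell out how the shape index map (SIM) is formed, so the sketch below is only an illustration: it computes the standard Koenderink-style shape index from per-point principal curvatures, which maps local surface type (cup, saddle, cap) to a scalar in [-1, 1] that a feature selector on the point cloud could use. The function name `shape_index_map`, the sign/ordering convention, and the NaN handling for planar patches are assumptions, not the authors' formulation.

```python
"""Illustrative sketch only; not the paper's exact SIM computation."""
import numpy as np


def shape_index_map(k1, k2, eps=1e-8):
    """Shape index from per-point principal curvatures, with k1 >= k2.

    One common convention (sign flips with normal orientation):
        S = (2 / pi) * arctan2(k1 + k2, k1 - k2)
    giving +1 for a dome, -1 for a cup, and 0 for a saddle.
    """
    k1 = np.asarray(k1, dtype=float)
    k2 = np.asarray(k2, dtype=float)
    si = (2.0 / np.pi) * np.arctan2(k1 + k2, k1 - k2)
    # The shape index is undefined on (near-)planar patches; mark them
    # NaN so a downstream feature selector can skip them.
    si[np.abs(k1) + np.abs(k2) < eps] = np.nan
    return si


if __name__ == "__main__":
    # Synthetic curvature pairs: dome, cup, saddle, ridge-like, plane.
    k1 = np.array([1.0, -0.5, 0.8, 1.0, 0.0])
    k2 = np.array([1.0, -0.5, -0.8, 0.0, 0.0])
    print(shape_index_map(k1, k2))  # -> [ 1.  -1.   0.   0.5  nan]
```

In a point-cloud setting the principal curvatures would themselves be estimated, e.g. from a local surface fit around each point, before applying a mapping like the one above.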