DUDMap: 3D RGB-D mapping for dense, unstructured, and dynamic environment

Hastürk Ö., Erkmen A. M.

International Journal of Advanced Robotic Systems, vol. 18, no. 3, 2021 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 18 Issue: 3
  • Publication Date: 2021
  • DOI: 10.1177/17298814211016178
  • Journal Name: International Journal of Advanced Robotic Systems
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Applied Science & Technology Source, Communication Abstracts, Compendex, INSPEC, Directory of Open Access Journals
  • Keywords: Dynamic mapping, visual SLAM, localization, 3D reconstruction, tracking, scale
  • Middle East Technical University Affiliated: Yes


© The Author(s) 2021.

The simultaneous localization and mapping (SLAM) problem has been extensively studied by researchers in the field of robotics; however, conventional mapping approaches assume a static environment. The static assumption is valid only in a small region, and it limits the application of visual SLAM in dynamic environments. Recently proposed state-of-the-art SLAM solutions for dynamic environments use semantic segmentation methods such as Mask R-CNN and SegNet; however, these frameworks are built on a sparse mapping framework (ORB-SLAM). In addition, the segmentation process increases the computational cost, which makes these SLAM algorithms unsuitable for real-time mapping. Therefore, there is no effective dense RGB-D SLAM method for real-world unstructured and dynamic environments. In this study, we propose a novel real-time dense SLAM method for dynamic environments, in which the 3D reconstruction error is exploited to separate static and dynamic classes modeled by a generalized Gaussian distribution. Our proposed approach requires neither explicit object tracking nor an object classifier, which makes it robust to any type of moving object and suitable for real-time mapping. Our method eliminates repeated views and uses consistent data, which enhances the performance of volumetric fusion. For completeness, we evaluate our proposed method on several publicly available highly dynamic datasets to demonstrate the versatility and robustness of our approach. Experiments show that its tracking performance is better than that of other dense and dynamic SLAM approaches.
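The abstract's core idea is that pixels belonging to moving objects produce reconstruction residuals that are unlikely under a generalized Gaussian model of the static scene. The following is a minimal sketch of that classification step, not the paper's actual implementation: the function name, the shape parameter `beta`, the threshold `tau`, and the robust scale estimate are all illustrative assumptions.

```python
import numpy as np

def dynamic_mask(observed_depth, predicted_depth, beta=1.5, tau=3.0):
    """Flag pixels whose reconstruction error is improbable under a
    zero-mean generalized Gaussian model of the static scene.

    beta: shape parameter (beta=2 recovers a Gaussian, beta=1 a Laplacian).
    tau:  threshold on the normalized error score.
    All parameter values here are illustrative, not from the paper.
    """
    residual = observed_depth - predicted_depth
    # Robust scale estimate from the median absolute residual
    # (0.6745 makes it consistent with the standard deviation for a Gaussian)
    alpha = np.median(np.abs(residual)) / 0.6745 + 1e-9
    # Generalized-Gaussian normalized error: |r / alpha|^beta
    score = np.abs(residual / alpha) ** beta
    return score > tau  # True = likely dynamic pixel
```

Because the decision depends only on the error statistics, not on recognizing object categories, any mover (a person, a chair being dragged, an unmodeled robot) is handled uniformly, which is consistent with the abstract's claim that no object classifier is required.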