Mobile Robot Localization Via Sensor Fusion Algorithms

Uyulan C., Ergüzel T. T., Arslan E.

Intelligent Systems Conference (IntelliSys), London, Canada, 7 - 8 September 2017, pp. 955-960

  • Publication Type: Conference Paper / Full Text
  • City: London
  • Country: Canada
  • Page Numbers: pp. 955-960
  • Middle East Technical University Affiliated: Yes


To work effectively with a mobile robot and maximize its performance, the robot's current pose must be estimated and tracked. In this paper, under the assumption that the initial pose, kinematics, and environmental model of the mobile robot are known, the robot's position and orientation are localized and tracked. Pose tracking with the odometry model alone suffers from unbounded error accumulation, so sensor fusion algorithms are applied to address this problem. Using the odometry and laser range finder models, the Extended Kalman Filter (EKF), Unscented Kalman Filter (UKF), Unscented Information Filter (UIF), and Extended Information Filter (EIF) algorithms were each tested on a graphical user interface (GUI) with occupancy grid maps as the environment model. In this context, the pose tracking and estimation performances of these non-linear model-based estimators are compared. Because occupancy grid maps are used, only the laser range finder's measurement uncertainty needs to be considered, unlike with feature-based maps; in this way, the computational complexity can be reduced. Evaluation of the simulation results indicates that the Extended Information Filter algorithm exhibited the most stable performance in mobile robot pose estimation.
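To make the filtering idea concrete, the sketch below shows one predict/update cycle of an EKF for a planar pose state [x, y, θ]. This is an illustration only, not the paper's implementation: the motion model is a simple differential-drive odometry step with velocity input (v, ω), and the measurement is range-bearing to a known point, standing in for the paper's laser range finder model on an occupancy grid. All function and variable names here are assumptions chosen for clarity.

```python
import numpy as np

def ekf_localize(mu, Sigma, u, z, landmark, R, Q, dt=1.0):
    """One EKF predict/update cycle for a planar robot pose [x, y, theta].

    Illustrative sketch: odometry motion model with input u = (v, w),
    range-bearing measurement z to a known landmark. R is the motion
    noise covariance, Q the sensor noise covariance (assumed names).
    """
    x, y, th = mu
    v, w = u

    # --- Predict: propagate the pose through the odometry motion model ---
    mu_bar = np.array([x + v * dt * np.cos(th),
                       y + v * dt * np.sin(th),
                       th + w * dt])
    # Jacobian of the motion model with respect to the state
    G = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                  [0.0, 1.0,  v * dt * np.cos(th)],
                  [0.0, 0.0,  1.0]])
    Sigma_bar = G @ Sigma @ G.T + R

    # --- Update: fuse one range-bearing measurement to the landmark ---
    dx, dy = landmark - mu_bar[:2]
    q = dx**2 + dy**2
    z_hat = np.array([np.sqrt(q),
                      np.arctan2(dy, dx) - mu_bar[2]])
    # Jacobian of the measurement model with respect to the state
    H = np.array([[-dx / np.sqrt(q), -dy / np.sqrt(q), 0.0],
                  [dy / q,           -dx / q,          -1.0]])
    S = H @ Sigma_bar @ H.T + Q              # innovation covariance
    K = Sigma_bar @ H.T @ np.linalg.inv(S)   # Kalman gain
    innov = z - z_hat
    innov[1] = (innov[1] + np.pi) % (2 * np.pi) - np.pi  # wrap bearing
    mu_new = mu_bar + K @ innov
    Sigma_new = (np.eye(3) - K @ H) @ Sigma_bar
    return mu_new, Sigma_new
```

The information filters (EIF, UIF) that the paper compares are the algebraic duals of this form: they propagate the information matrix Σ⁻¹ and information vector Σ⁻¹μ instead of Σ and μ, which changes the update step's cost structure but, in exact arithmetic, yields the same estimate.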