A novel vision-based calibration framework for industrial robotic manipulators

Balanji H. M., TURGUT A. E., Tunc L. T.

ROBOTICS AND COMPUTER-INTEGRATED MANUFACTURING, vol.73, 2022 (SCI-Expanded)

  • Publication Type: Article
  • Volume: 73
  • Publication Date: 2022
  • Doi Number: 10.1016/j.rcim.2021.102248
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Aerospace Database, Applied Science & Technology Source, Business Source Elite, Business Source Premier, Communication Abstracts, Compendex, Computer & Applied Sciences, INSPEC, Metadex, Civil Engineering Abstracts
  • Keywords: Industrial robot calibration, POE, Computer vision, Fiducial marker system, Vision-based robot calibration, KINEMATIC CALIBRATION, ACCURACY, SYSTEM, GENERATION, PRODUCT, MODELS
  • Middle East Technical University Affiliated: Yes


With the increasing involvement of industrial robots in manufacturing processes, the demand for high-quality robots has grown considerably. A high-quality robot is one with both good repeatability and good accuracy. Industrial robots are known to have very good repeatability; the same cannot be said of their accuracy. Due to harsh working conditions, the accuracy of robots deteriorates over time. Calibration is a practical approach to sustaining accuracy: the position and orientation of the tool center point (TCP) of the robot arm are corrected using a tracking device of higher accuracy. Devices such as laser trackers, optical CMMs, and stereo cameras have been used in the literature for this purpose. In this paper, a novel calibration framework is proposed based on a single camera and computer vision techniques using ArUco markers. The product of exponentials (POE) method is used for kinematic modeling of the robot to avoid singularities. The performance of the framework is tested in computer-based simulations and on a six-degree-of-freedom (6-DOF) UR5 robotic manipulator, with position and orientation errors used as metrics. In real-world experiments, the position and orientation errors reached 2.5 mm and 0.2 degrees, respectively. These results show that the method is usable in real-world scenarios.
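To illustrate the product-of-exponentials kinematic model referred to in the abstract, the sketch below implements standard POE forward kinematics in numpy. This is a minimal, generic illustration of the technique, not the authors' implementation; the twist coordinates and link lengths in the example are hypothetical (a planar 2R arm), chosen only to show how a tool-frame pose is composed from joint twists.

```python
import numpy as np

def skew(w):
    """3x3 skew-symmetric matrix of a 3-vector w."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def twist_exp(S, theta):
    """Homogeneous transform e^{[S] theta} for a unit twist S = (w, v)."""
    w, v = S[:3], S[3:]
    T = np.eye(4)
    if np.allclose(w, 0.0):          # pure translation (prismatic joint)
        T[:3, 3] = v * theta
        return T
    W = skew(w)
    # Rodrigues' rotation formula
    R = np.eye(3) + np.sin(theta) * W + (1.0 - np.cos(theta)) * (W @ W)
    G = (np.eye(3) * theta + (1.0 - np.cos(theta)) * W
         + (theta - np.sin(theta)) * (W @ W))
    T[:3, :3] = R
    T[:3, 3] = G @ v
    return T

def poe_fk(S_list, M, thetas):
    """POE forward kinematics: T = e^{[S1]t1} ... e^{[Sn]tn} M."""
    T = np.eye(4)
    for S, t in zip(S_list, thetas):
        T = T @ twist_exp(np.asarray(S, dtype=float), t)
    return T @ M

# Hypothetical example: planar 2R arm, link lengths L1 = L2 = 1
L1 = L2 = 1.0
M = np.eye(4)
M[0, 3] = L1 + L2                    # home pose of the TCP
S1 = [0, 0, 1, 0, 0, 0]              # revolute joint at the origin, z-axis
S2 = [0, 0, 1, 0, -L1, 0]            # revolute joint at (L1, 0), z-axis
T = poe_fk([S1, S2], M, [np.pi / 2, 0.0])
```

In a calibration setting, the appeal of this parameterization is that the twists vary smoothly and avoid the parameter singularities of Denavit–Hartenberg models, so measured TCP pose errors (e.g., from a camera observing fiducial markers) can be fed back as smooth corrections to the kinematic parameters.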