What is it about?
This article proposes a method for sensor fusion of odometers, a gyroscope, an accelerometer, a magnetometer, and a visual landmark localization system. The method is designed to estimate all 6 degrees of freedom (both translation and attitude) of a wheeled robot moving over uneven terrain. The fusion is based on continuous estimation of the mean square error of each estimated value and allows each sensor to have its own sampling rate. Thanks to its simple implementation, it is suitable for real-time processing on low-cost hardware. To evaluate the precision of the estimated position, stochastic sensor models (with parameters matching real hardware sensors) were used and random trajectories were simulated. The virtual experiments showed that the method tolerates the failure of any sensor except the odometers; nevertheless, each sensor contributes to the resulting precision.
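The summary does not spell out the fusion rule itself, but "continuous estimation of the mean square error of each estimated value" suggests weighting each source by the inverse of its MSE. A minimal, hypothetical sketch of that idea (the function name and the example numbers are illustrative, not from the paper) might look like:

```python
# Hypothetical sketch of MSE-weighted sensor fusion: each incoming
# estimate carries a running mean square error (MSE); the fused value
# weights each source by the inverse of its MSE, so noisier sensors
# contribute less. Sensors with no fresh sample at this instant simply
# do not appear in the input list, which accommodates the different
# sampling rates mentioned in the summary.

def fuse(estimates):
    """Fuse a list of (value, mse) pairs by inverse-MSE weighting.

    Returns the fused value and the MSE of the fused estimate
    (the harmonic combination of the individual MSEs).
    """
    weights = [1.0 / mse for _, mse in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total

# Illustrative numbers: odometry gives 2.0 m with MSE 0.04 m^2,
# a landmark fix gives 2.2 m with MSE 0.01 m^2.
pos, mse = fuse([(2.0, 0.04), (2.2, 0.01)])
# The fused estimate lies closer to the less noisy landmark fix,
# and the fused MSE is smaller than either input MSE.
```

Because the fused MSE is always below the smallest input MSE, every additional working sensor tightens the estimate, which is consistent with the finding that each sensor improves the resulting precision.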
Why is it important?
The article proposes a new sensor fusion method that combines readings from a variety of sensors: inertial sensors, visual landmarks, odometers, and a magnetometer. It is suitable for estimating the position of a wheeled vehicle, and its simplicity allows implementation on low-cost hardware.
Read the Original
This page is a summary of: Precise localization of the mobile wheeled robot using sensor fusion of odometry, visual artificial landmarks and inertial sensors, Robotics and Autonomous Systems, February 2019, Elsevier, DOI: 10.1016/j.robot.2018.11.019.