What is it about?

This article deals with the design of an automated system for creating a textured three-dimensional model of the environment. The method is based on a two-dimensional laser scanner, for which supporting hardware has been designed and constructed: the hardware rotates the scanner around the scan axis, extending a two-dimensional scanner embedded in a robotic system into a three-dimensional one. The resulting three-dimensional scan is thus formed from successive two-dimensional scans, each rotated with respect to the previous one. Appropriate software also had to be designed to control the motor and the scanner and to process the measured data.
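The principle described above can be sketched in a few lines of code. This is an illustrative simplification, not the authors' exact geometry: it assumes the scanner returns ranges in its own plane and that the supporting hardware tilts that plane about one fixed axis by a known angle per scan.

```python
import math

def scan_to_points(ranges, start_angle, angle_step, tilt):
    """Convert one 2-D polar scan, taken with the scan plane rotated by
    `tilt` about the x-axis, into 3-D Cartesian points.

    ranges      -- measured distances for consecutive beams
    start_angle -- in-plane angle of the first beam (rad)
    angle_step  -- angular increment between beams (rad)
    tilt        -- current rotation of the scan plane (rad)
    """
    points = []
    for i, r in enumerate(ranges):
        a = start_angle + i * angle_step   # in-plane beam angle
        x = r * math.cos(a)                # point within the scan plane
        y = r * math.sin(a)
        # rotate the scan plane about the x-axis by the current tilt
        points.append((x, y * math.cos(tilt), y * math.sin(tilt)))
    return points
```

Accumulating the points from all tilt angles of one full rotation yields the three-dimensional point cloud of the surroundings.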

Why is it important?

The proposed system can be mounted on various exploration robots that map space using the proposed method. Wheeled or tracked robotic platforms, as well as drones, can be used to explore hard-to-reach environments.

Perspectives

The obtained 3-D model of the real environment can be used for various purposes: robotics, building reconstruction, historical documentation, or virtual reality. This is especially valuable in dangerous areas where humans would be exposed to excessive risk but can control a robot remotely, with its movement visualized inside a textured 3-D model. The main advantages of our method are its simplicity (a 360° field of view in a single scan, without changing the position of the scanner) and its high precision (the position error is on the order of millimeters for indoor applications).

Professor Ales Janota
University of Zilina

Read the Original

This page is a summary of: Sensor fusion for creating a three-dimensional model for mobile robot navigation, International Journal of Advanced Robotic Systems, July 2019, SAGE Publications. DOI: 10.1177/1729881419865072.