3D Environment Visualization by Laser Pointcloud Stitching
This technology enables an operator to steer a robot safely through a previously unknown and potentially dangerous 3D environment. The operator is presented with a live 3D visualization of the environment and of the robot itself. This information is far superior to onboard camera images alone and allows complicated maneuvers while avoiding obstacles (e.g. the robot's umbilical cable). The 3D data can also be recorded and used as a basis for offline map generation.
3D Environment Visualization
The robot carries a rotating 3D laser range finder that was specially developed for this application. The laser data, consisting of 3D point clouds, is transmitted to a control PC for processing. New data is matched against previously collected points so that the result is a single coherent point cloud that correctly represents the environment. A viewer application on the PC lets the operator observe this representation in a convenient way.
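The matching step described above can be illustrated with a deliberately simplified sketch: aligning a newly acquired scan to the existing cloud by estimating the rigid transform that best fits one onto the other. The original system works on 3D clouds with unknown correspondences (typically solved iteratively, ICP-style); the 2D, known-correspondence version below only shows the core least-squares alignment idea. All names and the closed-form 2D solution are illustrative, not taken from the original implementation.

```python
import math

def align_2d(src, dst):
    """One closed-form rigid alignment step (a 2D simplification of the
    3D scan matching described above). src and dst are lists of (x, y)
    points with known one-to-one correspondences; returns (theta, tx, ty)
    such that rotating src by theta and translating by (tx, ty) best fits
    dst in the least-squares sense."""
    n = len(src)
    # Centroids of both point sets
    csx = sum(p[0] for p in src) / n
    csy = sum(p[1] for p in src) / n
    cdx = sum(p[0] for p in dst) / n
    cdy = sum(p[1] for p in dst) / n
    # Cross-covariance terms of the centered point sets
    sxx = sxy = syx = syy = 0.0
    for (x, y), (u, v) in zip(src, dst):
        x, y, u, v = x - csx, y - csy, u - cdx, v - cdy
        sxx += x * u; sxy += x * v
        syx += y * u; syy += y * v
    # Optimal rotation angle (closed form for the 2D least-squares fit)
    theta = math.atan2(sxy - syx, sxx + syy)
    # Translation maps the rotated source centroid onto the target centroid
    tx = cdx - (math.cos(theta) * csx - math.sin(theta) * csy)
    ty = cdy - (math.sin(theta) * csx + math.cos(theta) * csy)
    return theta, tx, ty
```

In the full 3D pipeline the correspondences are not known in advance, so this alignment step is alternated with a nearest-neighbor search and iterated until the new scan converges onto the existing cloud.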
How it works
The navigation system has a modular architecture, made possible mainly by the use of Ethernet for communication. Depending on which modules are mounted and used, the navigation filter on the control PC can provide anything from a rough to a precise localization estimate. The software consists of independently developed modules that share information over inter-process channels:
· The motor control software (which includes the navigation filter)
· Laser point cloud processor
· The 3D visualization module
· The video recording and playback module
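A minimal sketch of how such modules might exchange data over an Ethernet channel is shown below, here a batch of points sent from the point cloud processor to the visualization module. The UDP transport, JSON message schema, and function names are assumptions for illustration; the original system's wire format is not documented here.

```python
import json
import socket

def publish(sock, addr, points):
    """Serialize a batch of (x, y, z) points and send it on the channel."""
    msg = json.dumps({"type": "pointcloud", "points": points}).encode()
    sock.sendto(msg, addr)

def receive(sock):
    """Block until one message arrives on the channel and decode it."""
    data, _ = sock.recvfrom(65535)
    return json.loads(data.decode())

if __name__ == "__main__":
    # Demo on the loopback interface: the "visualizer" listens on an
    # ephemeral port (a fixed, agreed-upon port would be used in practice)
    # and the "processor" publishes one point batch to it.
    rx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    rx.bind(("127.0.0.1", 0))
    tx = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    publish(tx, rx.getsockname(), [[0.1, 0.2, 0.3], [1.0, 2.0, 3.0]])
    msg = receive(rx)
    print(msg["type"], len(msg["points"]))  # pointcloud 2
    tx.close()
    rx.close()
```

Because each module only depends on the channel, modules can be added, removed, or moved to different machines on the network without changing the others.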
Related Projects & Products
The 3D laser point cloud generation and stitching was first implemented on the FAST platform. Thanks to the modular design, however, the device can easily be adapted to other robots and platforms.