In this work, we present a method for fusing direct radiometric data from a thermal camera with inertial measurements to enable pose estimation of aerial robots.
Thermal cameras, such as those operating in the Long-Wave Infrared (LWIR) range, are not affected by the lack of illumination or by the presence of obscurants such as fog and dust. These characteristics make them a suitable choice for robot navigation in GPS-denied, completely dark, and obscurant-filled environments.
In contrast to previous approaches, which use 8-bit re-scaled thermal imagery as a complementary sensing modality to visual image data, our approach makes use of the full 14-bit radiometric data, making it generalizable to a variety of environments without the need for heuristic tuning.
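To illustrate why per-frame 8-bit rescaling can break photometric consistency while raw radiometric values remain directly comparable across frames, the following minimal sketch contrasts the two representations. It is a hypothetical example, not taken from the paper; the function name, image values, and min-max rescaling scheme are all illustrative assumptions.

```python
import numpy as np

def rescale_to_8bit(raw_frame):
    """Per-frame min-max rescaling, as commonly done to obtain 8-bit imagery.

    The mapping depends on the hottest and coldest pixels of *this* frame, so
    the same physical temperature can map to different 8-bit intensities in
    consecutive frames whenever the scene content changes.
    """
    lo, hi = int(raw_frame.min()), int(raw_frame.max())
    return ((raw_frame.astype(np.float64) - lo) / max(hi - lo, 1) * 255).astype(np.uint8)

# Two frames of the same static scene; a hot object enters the second frame.
frame_a = np.linspace(5000, 7000, 16, dtype=np.uint16).reshape(4, 4)  # 14-bit counts
frame_b = frame_a.copy()
frame_b[0, 0] = 15000  # new hot pixel

# Raw radiometric counts of an unchanged pixel are identical across frames...
print(frame_a[2, 2], frame_b[2, 2])
# ...but its 8-bit rescaled intensity changes, because the hot pixel
# shifted the per-frame normalization range.
print(rescale_to_8bit(frame_a)[2, 2], rescale_to_8bit(frame_b)[2, 2])
```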
Furthermore, our approach implements a keyframe-based joint optimization scheme, making odometry estimates robust against image data interruptions, which are common during the operation of thermal cameras due to the application of flat field corrections.
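As a loose illustration of how a keyframe-based scheme can ride out image dropouts caused by flat field corrections (FFC), the 1-D toy below propagates the state with inertial measurements alone while frames are missing and re-aligns against the stored keyframe once imagery resumes. This is purely a sketch of the idea: the constant-acceleration propagation, the `refine_against_keyframe` stand-in for the joint optimization, and all names are assumptions, not the paper's formulation.

```python
from dataclasses import dataclass

@dataclass
class State:
    position: float
    velocity: float

def propagate_with_imu(state: State, accel: float, dt: float) -> State:
    # Simple inertial propagation (constant acceleration over dt).
    v = state.velocity + accel * dt
    p = state.position + state.velocity * dt + 0.5 * accel * dt * dt
    return State(p, v)

def refine_against_keyframe(state: State, keyframe_pos: float,
                            measured_offset: float, gain: float = 0.5) -> State:
    # Stand-in for the joint optimization step: pull the propagated position
    # toward the position implied by aligning the current image with the keyframe.
    target = keyframe_pos + measured_offset
    return State(state.position + gain * (target - state.position), state.velocity)

state = State(position=0.0, velocity=1.0)
keyframe_pos = 0.0
dt = 0.1

# Stream of (accel, image_offset); None marks frames dropped during an FFC event.
stream = [(0.0, 0.1), (0.0, 0.2), (0.0, None), (0.0, None), (0.0, 0.5)]

for accel, image_offset in stream:
    state = propagate_with_imu(state, accel, dt)
    if image_offset is None:
        continue  # FFC in progress: inertial-only propagation, keyframe kept
    state = refine_against_keyframe(state, keyframe_pos, image_offset)
    print(f"pos={state.position:.3f}")
```

Because alignment is always performed against the persistent keyframe rather than the immediately preceding frame, losing a few frames during an FFC event does not break the measurement chain.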
Our experiments in a completely dark indoor environment demonstrate the reliability of our approach by comparing our estimated odometry against the ground truth provided by a VICON motion capture system.
To put our results into perspective, and due to the limited literature on thermal-vision fusion, we compare our method with state-of-the-art visual and visual-inertial odometry approaches, thus demonstrating the efficacy of our solution and the benefits of utilizing the full radiometric information.
We also demonstrate the reliable performance of our approach in a real-world application by estimating the pose of an aerial robot navigating through an underground mine in conditions of darkness and in the presence of heavy airborne dust.