In this letter, we propose a real-time, tightly-coupled
multi-sensor fusion framework that fuses measurements from a
LiDAR, an inertial sensor, and a camera to achieve robust and
accurate state estimation. Our proposed framework is composed of
two parts: filter-based odometry and factor graph
optimization. To guarantee real-time performance, we estimate
the state within the framework of an error-state iterated
Kalman filter, and further improve the overall precision with
our factor graph optimization.
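For intuition, the following is a minimal sketch of a generic iterated Kalman filter measurement update in C++ with Eigen. It is only an illustration under simplifying assumptions (a Euclidean state vector and a caller-supplied measurement model h with Jacobian H), not the implementation used in our framework, which maintains the error state on a manifold.

```cpp
// Generic iterated Kalman filter update (Gauss-Newton form); a sketch,
// not R2LIVE's actual error-state, on-manifold implementation.
#include <Eigen/Dense>
#include <functional>
#include <iostream>

using Vec = Eigen::VectorXd;
using Mat = Eigen::MatrixXd;

// x, P: prior state and covariance (updated in place).
// z, R: measurement and its noise covariance.
// h:    measurement model; H_of: its Jacobian at a linearization point.
void iekf_update(Vec& x, Mat& P, const Vec& z, const Mat& R,
                 const std::function<Vec(const Vec&)>& h,
                 const std::function<Mat(const Vec&)>& H_of,
                 int max_iters = 5, double eps = 1e-6) {
  const Vec x_prior = x;                      // propagated (prior) state
  const Mat I = Mat::Identity(x.size(), x.size());
  Mat K, H;
  for (int i = 0; i < max_iters; ++i) {
    H = H_of(x);                              // re-linearize at current estimate
    const Mat S = H * P * H.transpose() + R;  // innovation covariance
    K = P * H.transpose() * S.inverse();      // Kalman gain
    // Iterated update: correct toward the measurement while keeping
    // the prior term consistent with the new linearization point.
    const Vec dx = K * (z - h(x)) + (I - K * H) * (x_prior - x);
    x += dx;
    if (dx.norm() < eps) break;               // converged
  }
  P = (I - K * H) * P;                        // posterior covariance
}

int main() {
  // Toy usage: a 2-D state observed directly with small noise.
  Vec x(2); x << 0.0, 0.0;
  Mat P = Mat::Identity(2, 2);
  Vec z(2); z << 1.0, 2.0;
  const Mat R = 0.01 * Mat::Identity(2, 2);
  iekf_update(x, P, z, R,
              [](const Vec& s) -> Vec { return s; },  // h(s) = s
              [](const Vec& s) -> Mat {
                return Mat::Identity(s.size(), s.size());
              });
  std::cout << "posterior state: " << x.transpose() << std::endl;
  return 0;
}
```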
Taking advantage of measurements
from all individual sensors, our algorithm is robust to
various visual-failure and LiDAR-degenerated scenarios, and runs
in real time on an on-board computing platform, as
shown by extensive experiments conducted in indoor, outdoor, and
mixed environments of different scales (see the attached video:
https://youtu.be/9lqRHmlN_MA). Moreover, the results show that our proposed framework
achieves higher accuracy than state-of-the-art LiDAR-inertial or
visual-inertial odometry. To share our findings and to
contribute to the community, we open-source our code on
GitHub:
https://github.com/hku-mars/r2live.