More demonstrations can be seen in the following:
* Project website: [ Link ]
* Three kinds of feature detection and tracking: [ Link ]
* Evaluation in UZH-FPV: [ Link ]
* Onboard quadrotor flight evaluation: [ Link ]
* Outdoor large-scale evaluation: [ Link ]
* Our previous EIO work (IROS 2022): [ Link ]
Abstract: Robust and reliable state estimation in challenging situations, such as aggressive motion, remains an unsolved problem, especially when onboard state feedback control is required. In this paper, we propose a robust, real-time event-based visual-inertial odometry (VIO) framework that fuses event, image, and inertial measurements. In particular, we design line-based event features to provide additional structural constraints in human-made scenes, while point-based event features and point-based image features complement each other through well-designed feature management. Finally, the point-based and line-based visual residuals from the event camera, the point-based visual residuals from the standard camera, and the residuals from IMU pre-integration are tightly fused in a keyframe-based graph-optimization framework to provide reliable state estimation. Experiments on public benchmark datasets show that our method achieves superior performance compared with state-of-the-art image-based and event-based VIO. We also use our pipeline to demonstrate onboard closed-loop aggressive quadrotor flight and large-scale outdoor experiments.
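The core idea of tightly-coupled fusion is that residuals from all sensors are stacked into one joint nonlinear least-squares problem rather than filtered separately. The toy sketch below is not the paper's implementation (which uses point and line reprojection residuals on SE(3) in a keyframe graph); it only illustrates the stacking pattern on a 2-D position, with two hypothetical "visual" range sources standing in for the event and image cameras and a weighted prior standing in for the IMU pre-integration factor.

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical landmark sets for the two visual sources (illustration only).
landmarks_event = np.array([[0.0, 0.0], [4.0, 0.0]])
landmarks_image = np.array([[0.0, 3.0], [4.0, 3.0]])
true_pos = np.array([1.0, 1.0])

def ranges(pos, landmarks):
    # Distance from the current position estimate to each landmark.
    return np.linalg.norm(landmarks - pos, axis=1)

# Noiseless toy measurements, plus an IMU-style propagated prediction.
meas_event = ranges(true_pos, landmarks_event)
meas_image = ranges(true_pos, landmarks_image)
imu_pred = np.array([1.1, 0.9])
imu_sigma = 0.5  # assumed prior uncertainty (weight = 1/sigma)

def residuals(pos):
    r_event = ranges(pos, landmarks_event) - meas_event  # event-feature residuals
    r_image = ranges(pos, landmarks_image) - meas_image  # image-feature residuals
    r_imu = (pos - imu_pred) / imu_sigma                 # weighted IMU prior residual
    # Tightly-coupled fusion: one stacked residual vector, one joint solve.
    return np.concatenate([r_event, r_image, r_imu])

sol = least_squares(residuals, x0=np.array([0.0, 0.0]))
print(sol.x)  # near true_pos, pulled slightly toward the IMU prediction
```

The relative weighting of the residual blocks (here the `1/imu_sigma` factor) plays the role of the measurement covariances in the real optimization: tightening the IMU prior drags the estimate toward the inertial prediction, while the visual residuals anchor it to the observed features.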
