There are 36 repositories under the visual-inertial-odometry topic.
Robust Stereo Visual Inertial Odometry for Fast Autonomous Flight
Visual Inertial Odometry with SLAM capabilities and 3D Mesh generation.
Roadmap to become a Visual-SLAM developer in 2023
IMU + X(GNSS, 6DoF Odom) Loosely-Coupled Fusion Localization based on ESKF, IEKF, UKF(UKF/SPKF, JUKF, SVD-UKF) and MAP
HybVIO visual-inertial odometry and SLAM system
VIO_Learning. A simple implementation of monocular MSCKF, based on https://github.com/TurtleZhong/msckf_mono/tree/master/src/msckf_mine_1.0
An Authentic Dataset for Visual-Inertial Odometry
FLVIS: Feedback Loop Based Visual Inertial SLAM
This repository intends to enable autonomous drone delivery with the Intel Aero RTF drone and PX4 autopilot. The code can run either on the real drone or in simulation on a PC using Gazebo. Its core is a Robot Operating System (ROS) node, which communicates with the PX4 autopilot through mavros. It uses SVO 2.0 for visual odometry, WhyCon for visual marker localization, and Ewok for trajectory planning with collision avoidance.
A project of Visual Inertial Odometry for Autonomous Vehicle
Code for "Efficient Deep Visual and Inertial Odometry with Adaptive Visual Modality Selection", ECCV 2022
PRCV 2022: The FusionPortable-VSLAM Challenge
This project aims to hardware-synchronize a camera and an IMU so that both use the same (millisecond-precise) time base.
"Visual-Inertial Dataset" (RA-L'21 with ICRA'21): contains harsh motions for VO/VIO, such as pure rotation and fast rotation, across various motion types.
visual-inertial odometry (VINS-Mono) with motion-aware feature selection
A CUDA reimplementation of Bundle Adjustment for VINS-Fusion
iOS utility to save ARKit results (Visual-Inertial Odometry) to a series of text files for offline use.
Deep Learning for Visual-Inertial Odometry
Android app to save ARCore results (Visual-Inertial Odometry) to a series of text files for offline use.
A fully-annotated, open-design dataset of autonomous and piloted high-speed flight
Final project for EECE-5698 Robot Sensing and Navigation.
A bare-metal implementation of visual-inertial odometry on a microcontroller. This project is associated with my master's thesis in Engineering Cybernetics at NTNU.
Underwater dataset for visual-inertial methods, including data with transitions between multiple refractive media.
VO and VIO pipelines for pose estimation with a quad-rotor