SpectacularAI / HybVIO

HybVIO visual-inertial odometry and SLAM system

Home Page: https://arxiv.org/abs/2106.11857


Test without additional sensor data

v-diepttn147 opened this issue

Nice work!

I have one question about the input data. As stated in your paper:

Without additional inputs, these methods can only estimate the location relative to the starting point but provide no global position information.

Does this mean your code still works when there is only a single video as input? How can I modify the code, or which flags can I set, to feed in only one video for testing?

Thank you so much.
Diep Tran

The method in this codebase requires at least an IMU sensor (accelerometer + gyroscope) and one timestamped camera image stream. It will not work, for example, with a single video stream without IMU data. There is a parameter called -useStereo that controls how many image streams are used, in case stereo data is available.
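
For reference, a minimal invocation sketch. The -useStereo parameter is the one mentioned above; the main binary name, the -i input flag, and the =false boolean syntax are assumptions based on typical command-line examples for this project, so check the repository README for the exact usage. The dataset directory must still contain synchronized IMU data alongside the camera images.

```sh
# Sketch only: run HybVIO on a single (monocular) camera stream.
# -useStereo=false restricts processing to one image stream; IMU data
# (accelerometer + gyroscope) is still required in the input dataset.
# The binary name "main", the -i flag, and the boolean syntax are
# assumptions -- consult the repository README for the exact invocation.
./main -i=path/to/dataset -useStereo=false
```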