fabianschenk / REVO

Robust Edge-based Visual Odometry (REVO)


There is a problem with running large datasets

RuiYangQuan opened this issue

When I ran the algorithm on a dataset with more than 1500 frames, it could not map all of the scenes in the dataset. Is there a buffer mechanism that makes the algorithm stop drawing after a certain limit is exceeded? Please give me some suggestions, thank you!

Hi @RuiYangQuan,

Mhm, that should not happen. I often tested with a 30 fps sensor and recorded several minutes (~2k to 5k frames).
Does it just stop drawing, or does it lose tracking? Please keep in mind that the code is ~4 years old, and changes to Pangolin or other libraries might cause an issue.

Thanks for your reply. You are right, there is no problem with real-time operation from a sensor, but this problem often occurs when running on datasets.

Hi @RuiYangQuan,

Could you check the dataset config you're using? There is a parameter that limits the number of images to read, e.g. in dataset_tum1.yaml:

READ_N_IMAGES: 1000
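
If that parameter is set to 1000, only the first 1000 frames are loaded, which would explain why mapping stops partway through a sequence of more than 1500 frames. A minimal sketch of the change, assuming the same key in your dataset config (5000 is just an example value; set it to at least the number of frames in your dataset):

# dataset config (excerpt), example value only
READ_N_IMAGES: 5000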