specarmi / Thermal_SuperPoint_SLAM

Codebase for training a thermal SuperPoint network and vocabulary and integrating them with ORB-SLAM2

mono_euroc example

robotdevel opened this issue · comments

Hello author!

I was trying to test your algorithm,

but I have a problem running the KITTI example.

./thirdparty/SuperPoint_SLAM/Examples/Monocular/mono_kitti ./superpt_voc.yml.gz thirdparty/SuperPoint_SLAM/Examples/Monocular/KITTI03.yaml ../../dataset/data_odometry_gray/dataset/sequences/03/ ../../dataset/data_odometry_gray/dataset/sequences/03/RGB_Feat_and_Descriptors/

The command above follows your suggestion.

I followed your tutorial to test ORB-SLAM2 first, and the ORB-SLAM2 result is fine,

but SuperPoint RGB SLAM is not working.

Did I miss something?

The camera trajectory does not come out like ORB-SLAM2's.

I made the 'superpt_voc.yml.gz' file using only sequences/03/image_0.

Best regards!

Hi! Thank you for your interest in the project!

The issue is probably the vocabulary or the imported features. I'll clarify what I used to get the result shown in our video and report.

Vocabulary: in the command in the README I use vocabularies/superpoint_rgb.yml.gz. This is a vocabulary trained on RGB images using an RGB SuperPoint network. It was included in the original SuperPoint_SLAM repo; they provide a download link, which I've included here for convenience. This vocabulary was trained on the Bovisa_2008-09-01 dataset, the same dataset used to train the ORB vocabulary included with ORB-SLAM2.

Precomputed features and descriptors: in the command in the README I use ../datasets/kitti/data_odometry_gray/dataset/sequences/03/RGB_Feat_and_Descriptors/. This is a folder containing YAML files with precomputed RGB SuperPoint features and descriptors. I generated the YAML files with the following command:

python generate_keypts_and_desc.py ../trained_networks/superpoint_rgb/rgb.pth.tar ../../datasets/kitti/data_odometry_gray/dataset/sequences/03/image_0/ RGB_Feat_and_Descriptors
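
For anyone unfamiliar with the precomputed-feature setup, the folder holds one file per frame, named after its image, with that frame's keypoints and descriptors. Below is a minimal, hypothetical sketch of such a per-frame YAML layout; the exact schema `generate_keypts_and_desc.py` emits may differ, and `write_frame_yaml` is an illustrative helper, not part of the repo.

```python
# Hypothetical sketch of a one-YAML-file-per-frame layout for precomputed
# SuperPoint keypoints and descriptors. Illustration only: the real schema
# produced by generate_keypts_and_desc.py may differ.
import os
import random
import tempfile

def write_frame_yaml(path, keypoints, descriptors):
    """Write keypoints/descriptors in a minimal OpenCV-FileStorage-style YAML layout."""
    with open(path, "w") as f:
        f.write("%YAML:1.0\n---\n")
        f.write("keypoints:\n")
        for x, y in keypoints:
            f.write(f"   - [ {x:.1f}, {y:.1f} ]\n")
        f.write("descriptors:\n")
        for d in descriptors:
            f.write("   - [ " + ", ".join(f"{v:.4f}" for v in d) + " ]\n")

random.seed(0)
out_dir = tempfile.mkdtemp()
# Fake data standing in for one KITTI frame's SuperPoint output
# (SuperPoint descriptors are 256-dimensional floats).
kps = [(random.uniform(0, 1241), random.uniform(0, 376)) for _ in range(5)]
descs = [[random.uniform(-1, 1) for _ in range(256)] for _ in kps]
yaml_path = os.path.join(out_dir, "000000.yml")  # pairs with image 000000.png
write_frame_yaml(yaml_path, kps, descs)
print("wrote", yaml_path)
```

The point is simply that the SLAM frontend can look up each frame's features by filename instead of running the network online.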

It is key that the RGB SuperPoint network (trained_networks/superpoint_rgb/rgb.pth.tar) is used; it originally comes from the pytorch-superpoint repo here. Note: you might find even better success with the network trained on the KITTI dataset itself (here).

In my experience the results are quite sensitive to the vocabulary and network used. Training a useful vocabulary also seems to be challenging. As is mentioned in the report, using a small and homogeneous dataset for vocabulary training seems to result in an uneven vocabulary with relatively few words and this does seem to harm performance. So the vocabulary you've trained on KITTI sequence 03 might be your issue.
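To build intuition for why a vocabulary trained on a single short sequence can be weak, here is a toy quantization sketch. It is not the repo's DBoW2 training code: it just shows that descriptors drawn from a small set occupy only a fraction of the word space, so most vocabulary words go unused. All names and sizes here are made up for illustration.

```python
# Toy illustration (not DBoW2): descriptors from a small source set hit far
# fewer vocabulary words than descriptors from a large, varied set, leaving
# much of the vocabulary dead weight for place recognition.
import random
random.seed(42)

DIM, NUM_WORDS = 8, 64

def rand_vec():
    return tuple(random.uniform(-1, 1) for _ in range(DIM))

def nearest_word(desc, words):
    # Index of the closest word by squared Euclidean distance.
    return min(range(len(words)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(desc, words[i])))

words = [rand_vec() for _ in range(NUM_WORDS)]   # stand-in vocabulary
small_set = [rand_vec() for _ in range(30)]      # "one short sequence"
large_set = [rand_vec() for _ in range(2000)]    # "large, varied dataset"

small_hit = {nearest_word(d, words) for d in small_set}
large_hit = {nearest_word(d, words) for d in large_set}
print(f"words used by small set: {len(small_hit)}/{NUM_WORDS}")
print(f"words used by large set: {len(large_hit)}/{NUM_WORDS}")
```

With only 30 descriptors, at most 30 of the 64 words can ever be touched, which mirrors the "few, uneven words" problem described in the report.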

Even with the vocabulary and network I used, the performance is not quite as good as standard ORB-SLAM2 on KITTI 03: it takes longer to initialize and fails close to the end of the sequence. We only had time to compare the two qualitatively and deemed it close enough to validate our pipeline. There is likely a lot that can be improved, although it is worth pointing out that the original SuperPoint_SLAM authors published a paper showing inconsistent performance on the KITTI dataset with their RGB SuperPoint network.

Let me know if this helps!

Thank you so much for your reply!

I was able to run your algorithm using your suggestion! Thank you so much.

SuperPoint-SLAM had tracking failures in my setup. Did that happen for you as well?

My last question: was your training image size [256, 320] ([height, width])?

I wanted to check your dataset, but it is very large, so I haven't downloaded it yet.

Still, I am curious what images you used to train Thermal SuperPoint.

Thank you again!

Glad you were able to get it working!

In my experience it fails to track towards the end of KITTI sequence 03 but performs well for the majority of the sequence.

You can find the training details in section IV.A of the report.

Hi, I ran into a similar problem. Could you please help me solve it? Can I ask what you did in the end to run it successfully?
Thanks a lot! :pray: