[FIXED] Bug in the self-attention GNN.
zgojcic opened this issue
As correctly noticed by @zhulf0804 (thank you), there was a bug in our implementation of the GNN, which led to lower performance of PREDATOR. The bug was fixed in Pull Request #14. We have now retrained the model on the 3DMatch dataset and obtained higher performance (see the figure below). We are also retraining PREDATOR on the other datasets and will update the tables in the coming days.
Note that, due to the change in the GNN architecture, old pretrained models will no longer work, so make sure to always use the latest version.
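For readers unfamiliar with the layer in question, the core of a self-attention step over node features can be sketched as follows. This is a generic single-head scaled dot-product attention illustration, not the actual PREDATOR code; the function name and projection matrices (`Wq`, `Wk`, `Wv`) are hypothetical placeholders:

```python
import numpy as np

def self_attention(x, Wq, Wk, Wv):
    """Single-head self-attention over a fully connected node set.

    x: (n, d) node feature matrix; Wq, Wk, Wv: (d, d) projections.
    Returns the attention-weighted combination of the value vectors.
    """
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    # Scaled dot-product scores between every pair of nodes.
    scores = q @ k.T / np.sqrt(k.shape[1])
    # Row-wise softmax: each node's weights over all nodes sum to 1.
    a = np.exp(scores - scores.max(axis=1, keepdims=True))
    a = a / a.sum(axis=1, keepdims=True)
    return a @ v

# Tiny demo with identity projections.
rng = np.random.default_rng(0)
x = rng.standard_normal((5, 4))
out = self_attention(x, np.eye(4), np.eye(4), np.eye(4))
print(out.shape)  # (5, 4)
```

In an attentional GNN, such a layer is typically interleaved with cross-attention between the two point clouds and followed by feed-forward updates; the exact architecture used here is described in the paper.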
Similarly, we are now also retraining the ModelNet and KITTI models and will update the arXiv paper once all results are available. If you have any questions in the meantime, please contact us directly.
Best,
Zan
This issue has now been fixed for all models with both backbones (MinkowskiEngine and KPConv). The fix increases performance across all tasks; the new values can be found in the updated version of the paper.
We have also updated all the pretrained models so that the latest values can be reproduced.
Thank you for your patience.
Best,
the authors