iMED-Lab / OCTA-Net-OCTA-Vessel-Segmentation-Network


Cria/Rose-2: my results are way worse than the paper's

BenjaminBo opened this issue · comments

I ran your code's first and second stage, and I am confused as to why I get much worse results than those described in the paper.

I ran the first stage with resnest50-528c19ca.pth, and for the second stage I pointed the suffix variables to the [...]best-fusion.pth, [...]best-thick.pth and [...]best-thin.pth models generated by the first stage.
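Roughly speaking, this is what I mean (a minimal sketch; the directory and the variable names below are placeholders, not the repo's actual ones, and the real filenames have a longer prefix):

import os

CHECKPOINT_DIR = "results/first_stage"   # wherever stage 1 wrote its checkpoints (placeholder)

fusion_model_path = os.path.join(CHECKPOINT_DIR, "best-fusion.pth")   # placeholder variable name
thick_model_path  = os.path.join(CHECKPOINT_DIR, "best-thick.pth")    # placeholder variable name
thin_model_path   = os.path.join(CHECKPOINT_DIR, "best-thin.pth")     # placeholder variable name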

Quantitative results for the fusion step:

(Attached screenshots: octa_acc, octa_auc, octa_dice, octa_iou.)

AUC and ACC are fairly close, but Dice, the metric I consider most relevant, caps at around 0.41, which is far below the value reported in your paper (0.7077).
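For reference, this is the definition of Dice and IoU I have in mind when comparing the numbers (a minimal sketch assuming binary NumPy masks; it may not match this repo's evaluation code exactly):

import numpy as np

def dice_and_iou(pred, gt, eps=1e-8):
    """Dice and IoU for binary masks given as 0/1 NumPy arrays."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    dice = 2.0 * inter / (pred.sum() + gt.sum() + eps)
    iou = inter / (np.logical_or(pred, gt).sum() + eps)
    return dice, iou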

Qualitative results for the fusion step:

(Attached example predictions: 11_OD_SVP, 25_OD_SVP, 25_OS_SVP.)

We were interested in this paper because of the prospect of filtering out artifacts in OCTA images. The images above are probably the most extreme examples in the test set, but I don't consider the models I have trained good enough to use.

Conclusion:
I have no trouble believing that I messed up somewhere, which would explain why I get such bad results. I just don't know where.

I have not used the front_model-189-0.9019971996250081.pth that you linked here for the second stage. The download doesn't give me the file I want, but an .exe called BaiduNetdisk_7.19.0.18. I don't know what to do with that, and since I have no administrative rights on this machine, I can't run the installation it tries to launch.
Could you maybe upload front_model-189-0.9019971996250081.pth to a drive, like you did for resnest50-528c19ca.pth (#2)? I would like to know whether I can reproduce the results of your paper with that model.

Also, concerning front_model-189-0.9019971996250081.pth: is this a fusion, thin or thick model?
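If it is easier, I could also check that myself once I have the file, for example by inspecting the checkpoint's state_dict keys (a minimal sketch; how the keys map to fusion/thin/thick depends on the repo's model definitions):

import torch

ckpt = torch.load("front_model-189-0.9019971996250081.pth", map_location="cpu")

# Some checkpoints wrap the weights under a key, others are the state_dict itself.
state_dict = ckpt["state_dict"] if isinstance(ckpt, dict) and "state_dict" in ckpt else ckpt
if not isinstance(state_dict, dict):   # a whole model object was saved
    state_dict = state_dict.state_dict()

for name, tensor in list(state_dict.items())[:20]:
    print(name, tuple(tensor.shape))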

I just read here that you said both stages have to be run with the fixes given in the thread. I ran my first stage without those fixes and then my second stage with them.

I am now rerunning both stages with the fixes and will get back to you when I have the results.

Okay, so what I reported in my first comment still holds: I was not able to reproduce your results.

Accordingly, it would be great if you could upload front_model-189-0.9019971996250081.pth to a Google Drive, like you did for resnest50-528c19ca.pth, and explain which variable to set to its path.

commented

@BenjaminBo Could you please explain how you run this code? I'm facing this error when it tries to load resnest50-528c19ca.pth from the _url_format variable defined in resnest.py:

_url_format = 'https://hangzh.s3.amazonaws.com/encoding/models/{}-{}.pth'

request.py", line 649, in http_error_default
raise HTTPError(req.full_url, code, msg, hdrs, fp)
urllib.error.HTTPError: HTTP Error 403: Forbidden

I downloaded resnest50-528c19ca.pth as described in #2; it's the second link, I believe. I then used the downloaded .pth in the repo.

I'm not sure what this error is that you're showing me, but it looks like you are downloading the weights at runtime? Maybe that's the problem.
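If the runtime download is the problem, maybe you can point the loading code at the file you downloaded instead. A rough sketch, assuming a standard PyTorch state_dict checkpoint; where exactly to patch this in resnest.py may differ:

import torch

def load_local_backbone_weights(model, path="weights/resnest50-528c19ca.pth"):
    """Load a manually downloaded checkpoint instead of fetching it via _url_format."""
    state_dict = torch.load(path, map_location="cpu")
    # strict=False tolerates key mismatches between the backbone and the full network
    missing, unexpected = model.load_state_dict(state_dict, strict=False)
    return missing, unexpected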

commented

@BenjaminBo Thank you so much for the contribution 🙏🏻

No problem. If you manage to reproduce the results from the paper, or get better results than I did, please tell me how you did it :)

commented

@BenjaminBo How were your results on the front_main step? My results on that step were almost the same as in the paper.