ShellRedia / SAM-OCTA

The weights after pretraining on OCTA datasets

hwei-hw opened this issue

Thanks for your interesting and valuable work. Could you provide the model weights from the OCTA pretraining? Otherwise, I will need to pretrain the model again myself.

Thanks for your work, and looking forward to your reply!

Thank you for your kind words and interest in our work. We appreciate your support and enthusiasm for our research.

Regarding your request for the OCTA pretraining model weights, we are sorry to let you know that, at present, the paper associated with SAM-OCTA has not been accepted by any journal or conference. Because the work is still under review, we are unable to release the pretraining parameters publicly for now.


The paper has not been accepted yet, so we cannot provide the pretrained parameters for the time being. We are very sorry.

Thanks for your kind reply, and I hope to hear good news about your paper soon!

The trained weights have now been updated. The link has been added at the end of the README.


The pretrained weights have been updated, and the link has been added at the end of the README.
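
For anyone picking up the released checkpoint, here is a minimal loading sketch. It assumes the file is a plain PyTorch state_dict saved with torch.save; the file name sam_octa_weights.pth and the strict=False call are illustrative placeholders, not the repository's documented interface.

```python
import torch

# Placeholder path: use the checkpoint file linked at the end of the README.
checkpoint_path = "sam_octa_weights.pth"

# Load onto CPU first so no GPU is required just to inspect the file.
state_dict = torch.load(checkpoint_path, map_location="cpu")

# Print a few entries to see which parameter names the checkpoint contains.
for name, tensor in list(state_dict.items())[:5]:
    print(name, tuple(tensor.shape))

# After constructing the SAM-OCTA model as in the repo's training script,
# the weights could be restored with (strict=False tolerates key mismatches):
# model.load_state_dict(state_dict, strict=False)
```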

Can you upload them to Google Drive too? Thanks!