SHI-Labs / Neighborhood-Attention-Transformer

Neighborhood Attention Transformer, arXiv 2022 / CVPR 2023. Dilated Neighborhood Attention Transformer, arXiv 2022

Should freeze_at be set to 2 to freeze the pretrained weights downloaded from the official website?

aihcyllop opened this issue

For the Dilated Neighborhood Attention Transformer, when running instance segmentation on COCO:
How do I freeze the pretrained weights downloaded from the official website? Should freeze_at be set to 2?

Thanks for your interest.
If you're referring to the Mask R-CNN experiments, freeze_at should be set to 5 to freeze the entire backbone. The values map to backbone stages as follows:

- -1 (the default): freezes nothing.
- 0: freezes only the patch embedding.
- 1: has no additional effect, because there is no absolute positional embedding to freeze.
- 2: additionally freezes the first of the 4 transformer encoder blocks.
- 3 through 5: freeze further blocks, with 5 freezing the entire backbone.
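For reference, here is a minimal sketch of how this kind of stage-wise freezing is commonly implemented for a four-level hierarchical backbone (the value mapping above matches the mmdetection/Swin convention). The attribute names `patch_embed` and `levels` are assumptions for illustration, not necessarily the repo's actual names:

```python
import torch.nn as nn

def freeze_stages(backbone: nn.Module, freeze_at: int) -> None:
    """Freeze backbone stages up to `freeze_at`.

    -1: freeze nothing; 0: patch embedding only; 1: would also freeze an
    absolute positional embedding (absent here, so same as 0);
    2..5: patch embedding plus the first (freeze_at - 1) of the 4 levels.
    """
    if freeze_at < 0:
        return
    to_freeze = [backbone.patch_embed]  # stage 0: patch embedding
    # Stage 1 would be the absolute positional embedding; this backbone
    # has none, so freeze_at == 1 behaves like freeze_at == 0.
    to_freeze += list(backbone.levels[: max(freeze_at - 1, 0)])
    for module in to_freeze:
        module.eval()  # keep norm/dropout layers in eval mode as well
        for param in module.parameters():
            param.requires_grad = False
```

Calling a helper like this from the backbone's `train()` override (as mmdetection-style backbones do) keeps the frozen stages in eval mode throughout training.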

However, note that if your intention is only to run inference, you do not need to do any of this, and can simply refer to the inference scripts.

Closing due to inactivity. Feel free to reopen if you still have questions.