JingyunLiang / SwinIR

SwinIR: Image Restoration Using Swin Transformer (official repository)

Home Page: https://arxiv.org/abs/2108.10257


swin layer

jiaaihhy opened this issue · comments

In the Swin Transformer, the self-attention module alternates between two sub-blocks: plain window self-attention (W-MSA) and shifted-window self-attention (SW-MSA). In that code, the first, non-shifted window attention has no attn_mask, and only the shifted-window attention uses a mask. But in your code, it seems that every self-attention layer has an attn_mask. Does that mean no layer uses plain window self-attention, and every layer uses shifted windows instead? Thank you.

For non-shifted layers, we set attn_mask=None. See

`attn_mask = None`
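
For reference, here is a minimal sketch of that logic, following the mask construction from the original Swin Transformer implementation that SwinIR reuses. The standalone `calculate_mask` function below is an illustration (in SwinIR it is a method of the transformer block, and the exact signature here is assumed); the key point is that it returns None whenever shift_size == 0, so non-shifted layers run plain W-MSA.

```python
import torch

def calculate_mask(H, W, window_size, shift_size):
    """Sketch of the SW-MSA attention mask; returns None for W-MSA layers.

    Note: the standalone signature is illustrative; SwinIR implements this
    as a method on its transformer block.
    """
    if shift_size == 0:
        # Non-shifted layer: plain window attention, no mask needed.
        return None
    # Label each pixel with the index of the region it belongs to after
    # the cyclic shift, so cross-region pairs can be masked out.
    img_mask = torch.zeros((1, H, W, 1))
    h_slices = (slice(0, -window_size),
                slice(-window_size, -shift_size),
                slice(-shift_size, None))
    w_slices = (slice(0, -window_size),
                slice(-window_size, -shift_size),
                slice(-shift_size, None))
    cnt = 0
    for h in h_slices:
        for w in w_slices:
            img_mask[:, h, w, :] = cnt
            cnt += 1
    # Partition the label map into windows, then compare labels pairwise:
    # unequal labels get a large negative bias that zeroes the attention weight.
    mask_windows = img_mask.view(1, H // window_size, window_size,
                                 W // window_size, window_size, 1)
    mask_windows = mask_windows.permute(0, 1, 3, 2, 4, 5).reshape(
        -1, window_size * window_size)
    attn_mask = mask_windows.unsqueeze(1) - mask_windows.unsqueeze(2)
    return attn_mask.masked_fill(attn_mask != 0, -100.0).masked_fill(attn_mask == 0, 0.0)

# Non-shifted (W-MSA) layer: no mask.
print(calculate_mask(8, 8, window_size=4, shift_size=0))         # None
# Shifted (SW-MSA) layer: one (ws*ws, ws*ws) mask per window.
print(calculate_mask(8, 8, window_size=4, shift_size=2).shape)   # torch.Size([4, 16, 16])
```

So the mask is not applied everywhere: the block only builds and passes a mask when its shift_size is nonzero, and the layers alternate W-MSA and SW-MSA as in the original Swin Transformer.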

Feel free to reopen this issue if you have more questions.