SHI-Labs / Neighborhood-Attention-Transformer

Neighborhood Attention Transformer, arXiv 2022 / CVPR 2023. Dilated Neighborhood Attention Transformer, arXiv 2022

PE added on query and key

XiaoyuShi97 opened this issue

Hi. I see that the current version only supports PE as a bias weight added to the attention map. I wonder whether a future version could support adding PE to the query and key, which is another common way of applying PE. Thanks again for your work and prompt replies!
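For context, a minimal sketch (in plain PyTorch, not this repository's API) of what "adding PE to query and key" could look like; the module name and the learned `pos_emb` parameter here are hypothetical, purely for illustration:

```python
import torch
import torch.nn as nn

# Hypothetical illustration: learned absolute positional embeddings added to
# queries and keys before computing attention logits.
class QKPosAttention(nn.Module):
    def __init__(self, dim, num_tokens):
        super().__init__()
        self.qkv = nn.Linear(dim, dim * 3)
        self.pos_emb = nn.Parameter(torch.zeros(num_tokens, dim))  # learned PE

    def forward(self, x):  # x: (batch, num_tokens, dim)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q + self.pos_emb  # PE added to queries
        k = k + self.pos_emb  # PE added to keys
        attn = (q @ k.transpose(-2, -1)) * (q.shape[-1] ** -0.5)
        return attn.softmax(dim=-1) @ v
```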

Hello and thanks for the interest.

Could you possibly refer us to a paper so we can look into it further?
Our current version follows Swin in applying relative positional biases to the attention weights, based on the relative positions of the queries and keys with respect to each other.
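For comparison, here is a minimal sketch of that relative-positional-bias approach, simplified to full 1-D self-attention rather than neighborhood attention; the `rpb` table and indexing below are illustrative and not this repository's exact implementation:

```python
import torch
import torch.nn as nn

# Simplified Swin-style relative positional bias: one learnable scalar per
# relative offset, added to the attention logits before the softmax.
class RelPosBiasAttention(nn.Module):
    def __init__(self, dim, num_tokens):
        super().__init__()
        self.qkv = nn.Linear(dim, dim * 3)
        # One learnable bias per relative offset in [-(L-1), L-1].
        self.rpb = nn.Parameter(torch.zeros(2 * num_tokens - 1))
        idx = torch.arange(num_tokens)
        # relative_index[i, j] selects the bias for offset (i - j).
        self.register_buffer(
            "relative_index", idx[:, None] - idx[None, :] + num_tokens - 1
        )

    def forward(self, x):  # x: (batch, num_tokens, dim)
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        attn = (q @ k.transpose(-2, -1)) * (q.shape[-1] ** -0.5)
        attn = attn + self.rpb[self.relative_index]  # bias on attention weights
        return attn.softmax(dim=-1) @ v
```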

Closing this due to inactivity. If you still have questions feel free to open it back up.