SHI-Labs / Neighborhood-Attention-Transformer

Neighborhood Attention Transformer, arxiv 2022 / CVPR 2023. Dilated Neighborhood Attention Transformer, arxiv 2022

about the kernel size

XiaoyuShi97 opened this issue · comments

Hi, I noticed that there is no explicit parameter specifying the kernel size in natten.py. How does the CUDA code obtain it? From the shape of the RPB? Thanks!

Hello, thanks for your interest.
Yes, the RPB is always of shape 2 * kernel_size - 1 along each of the two spatial axes, for every head, so kernel_size is easy to recover from that.
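To illustrate the point above, here is a minimal sketch of recovering the kernel size from an RPB tensor's shape. It uses NumPy arrays in place of PyTorch tensors, and the function name `infer_kernel_size` is hypothetical, not part of NATTEN's API:

```python
import numpy as np

def infer_kernel_size(rpb: np.ndarray) -> int:
    # Hypothetical helper: the RPB has shape (num_heads, 2*k - 1, 2*k - 1),
    # so k can be recovered from the size of either spatial axis.
    span = rpb.shape[-1]
    assert span == rpb.shape[-2] and span % 2 == 1, "RPB spatial axes must match and be odd"
    return (span + 1) // 2

# Example: kernel_size = 7 gives an RPB with spatial extent 2*7 - 1 = 13.
rpb = np.zeros((4, 13, 13))
print(infer_kernel_size(rpb))  # → 7
```

Since 2*k - 1 is always odd, the shape unambiguously determines the kernel size.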

Thanks for your prompt reply!

Closing this due to inactivity. If you still have questions feel free to open it back up.