SHI-Labs / Neighborhood-Attention-Transformer

Neighborhood Attention Transformer, arXiv 2022 / CVPR 2023. Dilated Neighborhood Attention Transformer, arXiv 2022.


Tiny Bug in nattencuda.py

z-jiaming opened this issue

Great Work!

Well, I found a small bug in nattencuda.py.

pad_r = max(0, self.window_size - W)

It should be self.kernel_size here: the padding is applied when the feature size is smaller than the kernel size, so the pad amount must be computed against the kernel size, not the window size.
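A minimal sketch of the padding the report describes (not the repository's actual code; the function name and tensor layout are assumptions for illustration). It pads the bottom/right of the feature map so both spatial sides reach at least kernel_size:

```python
import torch
import torch.nn.functional as F

def pad_to_kernel_size(x: torch.Tensor, kernel_size: int) -> torch.Tensor:
    # x: (B, H, W, C) feature map. Pad H and W up to kernel_size if needed,
    # using kernel_size (not window_size) as the reference, per the fix above.
    B, H, W, C = x.shape
    pad_b = max(0, kernel_size - H)
    pad_r = max(0, kernel_size - W)
    if pad_b or pad_r:
        # F.pad pads the last dims first: (C_l, C_r, W_l, W_r, H_l, H_r)
        x = F.pad(x, (0, 0, 0, pad_r, 0, pad_b))
    return x

x = torch.zeros(1, 3, 3, 4)
y = pad_to_kernel_size(x, kernel_size=7)
print(tuple(y.shape))  # (1, 7, 7, 4)
```

With max(0, ...), inputs already larger than the kernel are left untouched, so the same code path handles both cases.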

Thank you for your interest, and for bringing this to our attention.