SHI-Labs / Neighborhood-Attention-Transformer

Neighborhood Attention Transformer, arXiv 2022 / CVPR 2023. Dilated Neighborhood Attention Transformer, arXiv 2022

When will the code be released?

linjing7 opened this issue

Hi, I think your work shows surprising performance, but parallelism is a concern of mine. When will you release your code?

Hi, thank you for your interest.

We are currently preparing the public release of our code, training scripts, config files, and checkpoints. We aim to release the code in the coming days, so please stay tuned.

As for parallelism, we wrote our own CUDA kernel to compute neighborhood attention (NA), and it is quite fast in its current form. It is not as fast as it theoretically could be, but those are optimizations we plan to explore in the near future.
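For readers wondering what that kernel computes: each query attends only to a local window of keys around its position. Below is a minimal, naive PyTorch sketch of single-head 2D neighborhood attention, with the function name and shapes chosen here for illustration; it omits the relative positional bias and multi-head handling from the paper, clamps windows at borders so every query sees a full kernel_size × kernel_size neighborhood, and assumes the feature map is at least kernel_size on each side. The actual CUDA kernel avoids these Python loops entirely.

```python
import torch
import torch.nn.functional as F

def neighborhood_attention_2d(q, k, v, kernel_size=7):
    """Naive single-head 2D neighborhood attention (illustrative sketch).

    q, k, v: tensors of shape (H, W, d). Each query at (i, j) attends
    only to a kernel_size x kernel_size window of keys centered on it,
    shifted inward at the borders so the window stays inside the map.
    Assumes H >= kernel_size and W >= kernel_size.
    """
    H, W, d = q.shape
    r = kernel_size // 2
    scale = d ** -0.5
    out = torch.empty_like(q)
    for i in range(H):
        for j in range(W):
            # Clamp the window's top-left corner so border queries
            # still attend to exactly kernel_size**2 keys.
            i0 = min(max(i - r, 0), H - kernel_size)
            j0 = min(max(j - r, 0), W - kernel_size)
            nk = k[i0:i0 + kernel_size, j0:j0 + kernel_size].reshape(-1, d)
            nv = v[i0:i0 + kernel_size, j0:j0 + kernel_size].reshape(-1, d)
            attn = F.softmax(q[i, j] @ nk.T * scale, dim=-1)
            out[i, j] = attn @ nv
    return out

x = torch.randn(14, 14, 32)
print(neighborhood_attention_2d(x, x, x).shape)  # torch.Size([14, 14, 32])
```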

I'm going to close this issue now, but feel free to reopen it if you have other questions.

Okay, looking forward to your code :).

Looking forward to your code.