HazyResearch / hyena-dna

Official implementation for HyenaDNA, a long-range genomic foundation model built with Hyena

Home Page: https://arxiv.org/abs/2306.15794

Need to swap layer norm op for Triton-based layer norm?

ankitvgupta opened this issue

In the flash-attention repo here, there is now a note that the fused CUDA layer norm op has been replaced with a Triton-based op.

In light of that, is it now reasonable to remove the suggestion to pip install the layer norm op from the dependencies section of this README?

It looks like on this line, you check whether the custom layer norm op is installed; if so, this param is set to true. Following the call stack, that sets this param in the Flash-Attention package, whose implementation here has moved to a Triton implementation.
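
For reference, the pattern in question looks roughly like the following. This is a minimal sketch with illustrative names, not a copy of the hyena-dna source:

```python
# Minimal sketch of the guard pattern described above (names are illustrative,
# and the actual variable/param names in hyena-dna may differ): the fused path
# is only enabled when the optional flash-attn layer norm extension imports
# successfully.
try:
    from flash_attn.ops.layer_norm import dropout_add_layer_norm  # fused CUDA op
except ImportError:
    dropout_add_layer_norm = None

# Flag that is then forwarded into the Flash-Attention block config.
fused_dropout_add_ln = dropout_add_layer_norm is not None
```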

However, later in the original HyenaDNA code, we are still using the non-Triton function. Does that need to be swapped out?
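
If so, one possible shape for the swap is sketched below. The module paths and argument names are assumptions based on the flash-attention Triton migration and should be checked against the installed version; this is not meant to replace the linked PR.

```python
# Hedged sketch: prefer the Triton layer norm if it is available, falling back
# across the module paths different flash-attn versions have used. All paths
# and argument names here are assumptions to be verified against the installed
# flash-attn version.
try:
    from flash_attn.ops.triton.layer_norm import layer_norm_fn  # newer layout
except ImportError:
    try:
        from flash_attn.ops.triton.layernorm import layer_norm_fn  # older layout
    except ImportError:
        layer_norm_fn = None

fused_dropout_add_ln = layer_norm_fn is not None

# A call site would then change from something like
#   out = dropout_add_layer_norm(x, residual, weight, bias, dropout_p, eps, prenorm=True)
# to the Triton function, whose keyword names differ, e.g.
#   out = layer_norm_fn(x, weight, bias, residual=residual, dropout_p=dropout_p,
#                       eps=eps, prenorm=True)
# (signatures here are illustrative only; check the flash-attn version in use).
```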

Relevant PR: Dao-AILab/flash-attention@abbc131

In case the answer is yes, I think this should do it: #58