Dao-AILab/flash-attention Issues
Closed · 1 comment · [Hopper GPU] FA3 [Varlen Support]?
Updated · Can FA 3 support other head_dim?
Updated · 3 comments · flash-attn3 supported L20?
Updated · 2 comments · Failed to build flash-attn
Updated · 1 comment · Does HuaWei 910B support?
Closed · 5 comments · Question on FA-2 worker scheme
Closed · 3 comments · How do I install this?
Updated · 3 comments · Flash Attention 3 3090 support
Updated · 4 comments · build failure
Updated · 4 comments · [FA3] Performance confusion
Updated · 3 comments · FA3 for Decoding phase
Updated · [FA3] release and wheels
Updated · Build flash-attn takes a lot of time
Updated · 1 comment · How to debug?
Closed · 1 comment · Logit soft-capping
Closed · 6 comments · H100 slower than A800