mosaicml / llm-foundry

LLM training code for Databricks foundation models

Home Page: https://www.databricks.com/blog/introducing-dbrx-new-state-art-open-llm


flash attention 2 setup.py

germanjke opened this issue

Hi, can I ask why you have two different options, gpu and gpu-flash2? By default I just install gpu, and it gives errors related to flash2. Why do we need gpu then? I assume gpu-flash2 includes everything from gpu plus flash attention 2?

Thanks!

https://github.com/mosaicml/llm-foundry/blob/main/setup.py#L100

gpu will install all the GPU requirements, including flash attention v1. gpu-flash2 will install all the GPU requirements, including flash attention v2.
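
For reference, here is a minimal sketch of how two such extras can be declared with setuptools. This is not the actual llm-foundry setup.py; the flash-attn version pins and the placeholder dependency lists are illustrative assumptions only.

```python
# Sketch: declaring separate "gpu" and "gpu-flash2" extras in setup.py.
# Version pins below are illustrative, not the pins used by llm-foundry.
from setuptools import setup

extra_deps = {}

# GPU requirements with flash attention v1.
extra_deps['gpu'] = [
    'flash-attn<2',      # illustrative pin for flash attention v1
    # ... other GPU-only dependencies ...
]

# The same GPU requirements, but with flash attention v2 instead.
extra_deps['gpu-flash2'] = [
    'flash-attn>=2,<3',  # illustrative pin for flash attention v2
    # ... other GPU-only dependencies ...
]

setup(
    name='llm-foundry',
    extras_require=extra_deps,
)
```

With a layout like this, `pip install -e ".[gpu]"` pulls in the extra that carries flash attention v1, while `pip install -e ".[gpu-flash2]"` pulls in the one that carries flash attention v2, so you pick exactly one of the two extras depending on which flash attention version you want.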