transformerlab / transformerlab-app

Experiment with Large Language Models


Need to install Flash Attention after all other requirements

aliasaria opened this issue

Our requirements.txt includes flash-attn, but flash-attn cannot be built unless torch and a few other libraries are already installed. We need a way to install every requirement except flash-attn first, and then install flash-attn afterwards.
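One way to sketch this two-phase install: split the requirements file into a main list and a deferred list, install the main list first, then install the deferred packages. The `split_requirements` helper below is hypothetical (not part of the repo); the package names in the example are illustrative.

```python
# Hypothetical helper: split requirement lines into two install phases so
# packages that need torch at build time (flash-attn) are installed last.
def split_requirements(lines, deferred=("flash-attn", "flash_attn")):
    """Return (main, late): requirements without the deferred packages,
    and the deferred requirement lines to install in a second pass."""
    main, late = [], []
    for line in lines:
        req = line.strip()
        if not req or req.startswith("#"):
            continue  # skip blanks and comments
        if any(req.lower().startswith(pkg) for pkg in deferred):
            late.append(req)
        else:
            main.append(req)
    return main, late

reqs = ["torch>=2.0", "transformers", "flash-attn==2.3.3"]
main, late = split_requirements(reqs)
# main → ["torch>=2.0", "transformers"]; late → ["flash-attn==2.3.3"]
```

In practice the second pass would be something like `pip install -r main.txt` followed by `pip install flash-attn --no-build-isolation`; the `--no-build-isolation` flag is the workaround the flash-attn project itself suggests, since it lets the build see the already-installed torch.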