blazerye / DrugAssist

DrugAssist: A Large Language Model for Molecule Optimization

Home Page: https://arxiv.org/abs/2401.10334


Issue in code

AKANKSHASINGH233 opened this issue · comments

I am not able to install flash-attn; kindly share the entire requirements file.

I was trying with Python 3.8.

I had a lot of trouble installing as well. The following worked for me; hope it helps.

nvidia-smi

NVIDIA-SMI 525.147.05 Driver Version: 525.147.05 CUDA Version: 12.0

mamba create -n drugassist4 python=3.8 pip cudatoolkit=11.7 cudatoolkit-dev=11.7 -c conda-forge -y
conda activate drugassist4
pip install torch==2.0.1 torchvision==0.15.2 torchaudio==2.0.2
pip install packaging
pip install -r requirements.txt --no-build-isolation
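One point worth noting about the environment above: the driver reports CUDA 12.0 while the environment installs cudatoolkit 11.7, and that pairing works because NVIDIA drivers are backward compatible with older toolkits. A minimal sanity check of that rule (pure Python; `toolkit_compatible` is a hypothetical helper for illustration, not part of DrugAssist):

```python
def toolkit_compatible(driver_cuda: str, toolkit_cuda: str) -> bool:
    """Return True if the installed CUDA toolkit is supported by the driver.

    NVIDIA drivers are backward compatible: a driver reporting CUDA 12.0
    (via nvidia-smi) can run code built against toolkit 11.7 or older.
    """
    def parse(v: str) -> tuple:
        return tuple(int(part) for part in v.split("."))
    return parse(toolkit_cuda) <= parse(driver_cuda)

# Versions from the session above: nvidia-smi reports CUDA 12.0,
# the conda environment installs cudatoolkit 11.7.
print(toolkit_compatible("12.0", "11.7"))  # True
```

The reverse pairing (e.g. toolkit 11.7 on a driver that only reports CUDA 11.4) would fail, which is a common hidden cause of flash-attn build errors.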

There is no problem using a Python 3.8 environment. We have received feedback about dependency conflicts in certain situations where the package requirements cannot be satisfied. We have updated requirements.txt; the change is vllm 0.1.3 -> 0.1.4. Please try again with the updated file. Alternatively, you can try the solution provided by ajaymur91.
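For reference, the pins confirmed in this thread would look like the following fragment of requirements.txt (only the torch and vllm versions are stated here; the flash-attn line is illustrative, since the repo's file should be treated as authoritative):

```
torch==2.0.1
vllm==0.1.4
flash-attn  # builds from source; install with --no-build-isolation
```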

I faced the same problem. Have you solved it?

Me too. Have you solved it?

I still can't install flash-attn.