rmihaylov / falcontune

Tune any FALCON in 4-bit

Missing compatibility with torch 1.13

phisad opened this issue

When we run the model on our servers, we encounter the following problem:

File "/home/users//.cache/huggingface/modules/transformers_modules/tiiuae/falcon-40b-instruct/5b9409410d251ab8e06c48078721c8e2b71fa8a1/modelling_RW.py", line 289, in forward
attn_output = F.scaled_dot_product_attention(
AttributeError: module 'torch.nn.functional' has no attribute 'scaled_dot_product_attention'

Are there any plans to make the model also work on slightly older versions of PyTorch?

I think this would be useful as many people are still on 1.13.
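
For context, `F.scaled_dot_product_attention` was only added in PyTorch 2.0, which is why `modelling_RW.py` fails on 1.13. As a rough, untested sketch of a workaround (a plain, unfused implementation, not the optimized PyTorch 2.0 kernel, and with no claim that the model was validated this way), one could monkey-patch a fallback before loading the model:

```python
import math
import torch
import torch.nn.functional as F

def _sdpa_fallback(query, key, value, attn_mask=None,
                   dropout_p=0.0, is_causal=False):
    # query/key/value: (..., seq_len, head_dim)
    scale = 1.0 / math.sqrt(query.size(-1))
    scores = torch.matmul(query, key.transpose(-2, -1)) * scale

    if is_causal:
        # Mask out positions attending to the future.
        L, S = query.size(-2), key.size(-2)
        causal = torch.ones(L, S, dtype=torch.bool, device=query.device).tril()
        scores = scores.masked_fill(~causal, float("-inf"))

    if attn_mask is not None:
        if attn_mask.dtype == torch.bool:
            scores = scores.masked_fill(~attn_mask, float("-inf"))
        else:
            scores = scores + attn_mask

    weights = torch.softmax(scores, dim=-1)
    if dropout_p > 0.0:
        weights = F.dropout(weights, p=dropout_p)
    return torch.matmul(weights, value)

# Only patch when the attribute is actually missing (torch < 2.0).
if not hasattr(F, "scaled_dot_product_attention"):
    F.scaled_dot_product_attention = _sdpa_fallback
```

This would need to run before the `modelling_RW.py` module is imported. Note that memory use and speed will be noticeably worse than the fused attention in PyTorch 2.0, so upgrading torch is still the cleaner fix where possible.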