rmihaylov / falcontune

Tune any FALCON in 4-bit

FalconLite support

hvico opened this issue · comments

Hello.

Amazon has released a 4-bit quantized version of Falcon-40B that supports long contexts (up to 11K tokens).

It would be great to be able to fine-tune this extended-context version on a custom dataset using falcontune.

https://huggingface.co/amazon/FalconLite
https://medium.com/@chenwuperth/extend-the-context-length-of-falcon40b-to-10k-85d81d32146f

Thanks for the great work!

Hi,

Any success fine-tuning the FalconLite model?