exl2 autosplit
noobhunterd opened this issue · comments
noobhunterd commented
Describe the bug
I'm using this model:
https://huggingface.co/LoneStriker/airoboros-70b-3.3-2.4bpw-h6-exl2
If I load it on 1 GPU, it works perfectly with 2k context + 8-bit cache.
But if I use autosplit across 2 GPUs with 8k context, it responds with nonsense.
Is there some option I should tick to make it work? I just updated to the latest version of text-generation-webui, and the Llama 2 (L2) version of airoboros works fine with autosplit on.
Is there an existing issue for this?
- I have searched the existing issues
Reproduction
Use the latest version of text-generation-webui.
Load this model: https://huggingface.co/LoneStriker/airoboros-70b-3.3-2.4bpw-h6-exl2 with autosplit and 8k context.
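As a possible workaround while debugging, a manual split can be tried instead of autosplit to check whether the automatic allocation itself is the problem. This is a sketch assuming text-generation-webui's ExLlamav2 loader flags (`--loader`, `--gpu-split`, `--max_seq_len`); the split values are rough guesses for a 4090 + 3090 pair, not tested numbers:

```shell
# Hypothetical manual split: pin per-GPU memory instead of autosplit.
# 17,21 = GiB reserved on GPU0 (4090) and GPU1 (3090); adjust for your setup.
python server.py \
  --model airoboros-70b-3.3-2.4bpw-h6-exl2 \
  --loader exllamav2 \
  --gpu-split 17,21 \
  --max_seq_len 8192
```

If a manual split produces coherent output where autosplit does not, that would point at the autosplit layer placement rather than the quant itself.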
Screenshot
No response
Logs
Instruct:
1 GPU log:
Input: hello
Output: Hello! I'm here to help you with any questions or tasks you may have. What can I assist you with today?
Autosplit (2 GPU) log:
Input: hello
Output: The string to the text-to-text model 1001
System Info
4090 FE + 3090 MSI via eGPU