johnsmith0031/alpaca_lora_4bit Issues
Support for moe model?
Updated 2monkeypatch problem
Updated 8Unable to Build Wheels
Closed 8Merging LoRA after finetune
Updated 1Targeting all layers and biases
Closed 2Flash Attention 2
Closed 1July
Closed 4Crashes during finetuning
Updated 2how to change into 8 bit
Updated 1Problem with inference
Closed 7fine tune with 2 GPU
Updated 2Version of GPTQ
Updated 3how to infer with finetuned model?
Updated 4Consider using new QLoRA
Updated 3Finetuning 2-bit Quantized Models
Updated 7Code reference request
Updated 1Other datasets
Closed 2