How do you save a model after using WOQ?
eduand-alvarez opened this issue
Describe the issue
I'm unsure how to save a model after it has been quantized using WOQ, and then how to load it with Hugging Face or PyTorch. Here is my code so far.
```python
import torch
import intel_extension_for_pytorch as ipex
from transformers import AutoTokenizer, AutoModelForCausalLM

# PART 1: Model and tokenizer loading
tokenizer = AutoTokenizer.from_pretrained("Intel/neural-chat-7b-v3-3")
model = AutoModelForCausalLM.from_pretrained("Intel/neural-chat-7b-v3-3")

# PART 2: Preparation of quantization config
qconfig = ipex.quantization.get_weight_only_quant_qconfig_mapping(
    weight_dtype=torch.qint8,  # or torch.quint4x2
    lowp_mode=ipex.quantization.WoqLowpMode.NONE,  # or FP16, BF16, INT8
)
checkpoint = None  # optionally load int4 or int8 checkpoint

# PART 3: Model optimization and quantization
model = ipex.llm.optimize(model, quantization_config=qconfig, low_precision_checkpoint=checkpoint)
```
I want to save the model. Can you help?
Hello @eduand-alvarez, you can use the following example to save the quantized model:
```python
# Save the quantized model
torch.jit.save(torch.jit.script(model), "quantized_model.pt")
```
In this example, we use PyTorch's JIT (Just-In-Time) compiler to save the quantized model as a TorchScript module: `torch.jit.script` converts the PyTorch model into a TorchScript module, and `torch.jit.save` writes that module to a file. The resulting `quantized_model.pt` file contains the quantized weights together with the model graph, and can be loaded back with PyTorch.
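As a side note, `torch.jit.script` can fail on full Hugging Face generation models. If that happens, a tracing-based fallback along the lines below is common; this is only a sketch (not from this thread), and the example prompt and config tweaks are assumptions you may need to adjust:

```python
# Sketch: trace the optimized model with example inputs instead of scripting it.
model.eval()
model.config.return_dict = False  # tuple outputs trace more cleanly
model.config.use_cache = False    # avoid cache objects in the traced graph

example = tokenizer("What is machine learning?", return_tensors="pt")

with torch.no_grad():
    traced = torch.jit.trace(
        model,
        example_kwarg_inputs={
            "input_ids": example["input_ids"],
            "attention_mask": example["attention_mask"],
        },
        strict=False,
    )
    traced = torch.jit.freeze(traced)

torch.jit.save(traced, "quantized_model.pt")
```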
To load the quantized model using PyTorch:
```python
loaded_model = torch.jit.load("quantized_model.pt")
```
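A minimal sketch of exercising the reloaded module, reusing the tokenizer from Part 1 (the prompt is an arbitrary example, and note that a TorchScript module only exposes `forward`, not `generate`):

```python
loaded_model.eval()

# Tokenize a sample prompt and run a plain forward pass.
inputs = tokenizer("Tell me a fun fact about space.", return_tensors="pt")
with torch.no_grad():
    outputs = loaded_model(inputs["input_ids"], attention_mask=inputs["attention_mask"])
```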
Note that the TorchScript archive cannot be passed directly to Hugging Face's `AutoModelForCausalLM.from_pretrained`, which expects a model directory (or Hub ID) containing a config and weights saved with `save_pretrained`, not a single `.pt` file. To reload the quantized TorchScript model, use `torch.jit.load` as shown above.
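If you also want the tokenizer and model config stored alongside the TorchScript file (so another script can rebuild the preprocessing pipeline), the standard `save_pretrained` calls work for those pieces; the directory name below is just an example:

```python
import os

save_dir = "quantized_model_assets"  # arbitrary example directory
os.makedirs(save_dir, exist_ok=True)

tokenizer.save_pretrained(save_dir)     # tokenizer files
model.config.save_pretrained(save_dir)  # config.json
```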
Hope this is helpful to you.
This is great! Thank you!