mit-han-lab / smoothquant

[ICML 2023] SmoothQuant: Accurate and Efficient Post-Training Quantization for Large Language Models

Home Page: https://arxiv.org/abs/2211.10438

Support GPT-NeoX model

amazingkmy opened this issue · comments

Hi,
I am working on quantizing a GPT-NeoX model. During the quantization process, I got the following message:

"You are using a model of type gpt_neox to instantiate a model of type opt. This is not supported for all configurations of models and can yield errors."

Is GPT-NeoX conversion not supported?