Support GPT-NeoX model
amazingkmy opened this issue
amazingkmy commented
Hi,
I am trying to quantize a GPT-NeoX model. During the quantization process, I get the following warning:
"You are using a model of type gpt_neox to instantiate a model of type opt. This is not supported for all configurations of models and can yield errors."
Is GPT-NeoX conversion not supported?
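For context, here is a minimal sketch of what seems to be happening, assuming the quantization script instantiates an OPT-specific class directly rather than dispatching on the checkpoint's `model_type` (the tiny config values below are made up purely for illustration):

```python
import tempfile

from transformers import AutoConfig, GPTNeoXConfig, OPTConfig

with tempfile.TemporaryDirectory() as tmp:
    # Save a tiny GPT-NeoX config to stand in for the real checkpoint.
    GPTNeoXConfig(
        hidden_size=64,
        num_hidden_layers=2,
        num_attention_heads=4,
        intermediate_size=128,
    ).save_pretrained(tmp)

    # Loading it through an OPT class logs the warning
    # "You are using a model of type gpt_neox to instantiate a model of
    # type opt ..." because the saved model_type does not match the class.
    opt_cfg = OPTConfig.from_pretrained(tmp)

    # The Auto* classes dispatch on model_type instead, so the correct
    # GPT-NeoX classes are selected and no mismatch warning is logged.
    auto_cfg = AutoConfig.from_pretrained(tmp)

print(type(opt_cfg).__name__, type(auto_cfg).__name__)
```

If the script allows it, loading the checkpoint through `AutoConfig` / `AutoModelForCausalLM` (or using a conversion path written for `gpt_neox`) should avoid the mismatch; otherwise the OPT-specific code path would need to be adapted to GPT-NeoX.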