How to do inference if I want to use another LM rather than OpenAI?
hihihihiwsf opened this issue · comments
Hi,
Thanks for sharing this great work. I am trying to experiment with ICL on my own domain-specific task, and I already have a trained LM for it. How can I change the OpenAI API inference to inference with my own trained model?
Hi, thanks for your interest. This repo also supports models from Hugging Face, so as long as your model is pretrained or finetuned from a model on huggingface.co/models, you can set the model (which can be a local directory) as model_name, like:
https://github.com/HKUNLP/icl-ceil/blob/e23c539810f4275e1c65f7a3c91b3594aa31c575/scripts/run_bm25.sh#L9C35-L10C1
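A minimal sketch of what that looks like: override the model name in the run script with either a Hub id or a local checkpoint directory. The variable name `model_name` and the path below are illustrative; check the linked line of `run_bm25.sh` for the exact variable the repo uses.

```shell
# Hypothetical override for scripts/run_bm25.sh: point model_name at a
# Hugging Face Hub id (e.g. "gpt2") or a local directory containing a
# model saved with save_pretrained(). The actual variable name in the
# script may differ; this only illustrates the idea.
model_name="path/to/your/finetuned-model"
echo "Using model: ${model_name}"
```

Anything loadable by `transformers` `from_pretrained` (a Hub id or a local directory with the config and weights) should work here.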
Reopen if needed.