How about using a local LLM?
wzsage opened this issue · comments
"Thank you very much for developing such a fantastic program. However, I have a suggestion: could we use a local LLM through an OpenAI-compatible endpoint by modifying the OpenAI base URL?"
Yes, you absolutely can! It will be a little tricky because we're using Gemini right now, but I'll be switching to the Vercel AI SDK, after which model switching will be much, much easier.
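For anyone who wants to try this in the meantime, here's a minimal sketch of the idea behind the question: OpenAI-compatible local servers (Ollama, LM Studio, llama.cpp's server, etc.) all accept the standard chat-completions payload, so only the base URL needs to change. The URL, port, and model name below are assumptions for illustration (Ollama's defaults), not part of this project:

```python
import json

# Assumption: a local OpenAI-compatible server, e.g. Ollama's default endpoint.
BASE_URL = "http://localhost:11434/v1"

# The same request body the OpenAI API expects; "llama3" is a placeholder
# for whatever model you have pulled locally.
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Hello from a local LLM"}],
}
body = json.dumps(payload).encode()

# To actually send it (requires the local server to be running):
# import urllib.request
# req = urllib.request.Request(
#     f"{BASE_URL}/chat/completions",
#     data=body,
#     headers={"Content-Type": "application/json"},
# )
# print(urllib.request.urlopen(req).read().decode())
```

Most official OpenAI client libraries expose a `base_url` / `baseURL` option that does exactly this redirection, so once the app reads that value from config, pointing it at a local server should be a one-line change.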