Allow more LLM models for Langchain::Assistant
mengqing opened this issue · comments
Currently it only supports OpenAI, and I can see there's a PR for supporting Gemini (#513).
Would it be better to make it more generic and extensible? E.g., being able to add a new LLM provider without hard-coding `if` conditions in the base class.
Many other LLM providers follow the OpenAI API specification.
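One consequence of that compatibility is that the same request shape can target different providers just by swapping the base URL. A minimal sketch using only the Ruby standard library (the endpoints, model names, and the `build_chat_request` helper below are illustrative, not part of langchainrb):

```ruby
require "net/http"
require "json"
require "uri"

# Build an OpenAI-style chat completion request against any compatible host.
# Only the base URL changes between providers; the body shape stays the same.
def build_chat_request(base_url, model, content)
  uri = URI.join(base_url, "/v1/chat/completions")
  req = Net::HTTP::Post.new(uri)
  req["Content-Type"] = "application/json"
  req.body = JSON.generate(
    model: model,
    messages: [{ role: "user", content: content }]
  )
  req
end

openai_req = build_chat_request("https://api.openai.com", "gpt-4o-mini", "hi")
local_req  = build_chat_request("http://localhost:11434", "llama3", "hi")
# Identical body structure; only the host differs.
```

This is why a generic abstraction pays off: OpenAI-compatible providers could share one code path, with only truly divergent APIs needing their own subclass.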
@mengqing Yes, I agree with you. It's best to keep it as generic as possible. Do you have any specific ideas/changes to propose? The current Gemini PR is a rough draft written before I'd thought about any abstractions. I've been considering a base/parent class with child classes that override specific methods, constants, etc.
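The base-class idea above could look roughly like this. All class and method names here (`BaseAdapter`, `OpenAIAdapter`, `GeminiAdapter`, `role_for_tool`) are hypothetical, not langchainrb's actual API; the point is that the Assistant only talks to one interface and never branches on provider type:

```ruby
# Hypothetical sketch of a provider-agnostic adapter layer.
class BaseAdapter
  # Subclasses override this instead of the Assistant using `if` conditions.
  def chat(messages:)
    raise NotImplementedError, "#{self.class} must implement #chat"
  end

  # Provider-specific constants become overridable methods with defaults.
  def role_for_tool
    "tool"
  end
end

class OpenAIAdapter < BaseAdapter
  def chat(messages:)
    # A real implementation would call the OpenAI API here; stubbed for the sketch.
    { provider: "openai", reply: "stub for: #{messages.last[:content]}" }
  end
end

class GeminiAdapter < BaseAdapter
  def chat(messages:)
    # A real implementation would call the Gemini API here; stubbed for the sketch.
    { provider: "gemini", reply: "stub for: #{messages.last[:content]}" }
  end
end

# The assistant depends only on the adapter interface:
class Assistant
  def initialize(llm:)
    @llm = llm
  end

  def ask(text)
    @llm.chat(messages: [{ role: "user", content: text }])
  end
end

Assistant.new(llm: OpenAIAdapter.new).ask("hello")[:provider] # => "openai"
Assistant.new(llm: GeminiAdapter.new).ask("hello")[:provider] # => "gemini"
```

With this shape, adding a new provider means adding one subclass; the Assistant and any existing adapters stay untouched.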