Add support for multiple AI agents using Azure-deployed LLM endpoints
orngeatom opened this issue
How can we add New Chat buttons that use the existing OpenAI connection, and also New Chat buttons that use Azure-deployed endpoints for real-time inference? Could models.ts hold a variable recording which AI model backs each button, with the services containing the endpoint configuration, so the existing chat service structure can be reused?
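One way the request could be sketched: extend the model definition with a provider field and optional Azure endpoint details, then resolve the request URL per provider so the rest of the chat service stays unchanged. This is a minimal sketch under assumptions — `ModelConfig`, `resolveEndpoint`, and the field names are hypothetical illustrations, not the project's actual models.ts structure; the Azure URL shape follows the standard `{endpoint}/openai/deployments/{deployment}/chat/completions?api-version=...` pattern.

```typescript
// Hypothetical sketch: ModelConfig and resolveEndpoint are illustrative names,
// not taken from the project's codebase.

type Provider = "openai" | "azure";

interface ModelConfig {
  id: string;             // label shown on the New Chat button
  provider: Provider;     // which backend serves this model
  model: string;          // OpenAI model name or Azure deployment name
  // Azure-only fields (ignored for the OpenAI provider)
  azureEndpoint?: string;     // e.g. https://<resource>.openai.azure.com
  azureApiVersion?: string;
}

// Build the chat-completions URL for a config, so the existing chat
// service can consume either provider through one code path.
function resolveEndpoint(cfg: ModelConfig): string {
  if (cfg.provider === "azure") {
    return `${cfg.azureEndpoint}/openai/deployments/${cfg.model}` +
      `/chat/completions?api-version=${cfg.azureApiVersion}`;
  }
  return "https://api.openai.com/v1/chat/completions";
}

// Example registry: one OpenAI-backed and one Azure-backed entry
// (resource and deployment names are placeholders).
const models: ModelConfig[] = [
  { id: "GPT-4 (OpenAI)", provider: "openai", model: "gpt-4" },
  {
    id: "GPT-4 (Azure)",
    provider: "azure",
    model: "my-gpt4-deployment",
    azureEndpoint: "https://my-resource.openai.azure.com",
    azureApiVersion: "2024-02-01",
  },
];
```

Each New Chat button would then map to one `ModelConfig` entry, and the chat service would call `resolveEndpoint` instead of hard-coding the OpenAI URL.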