apocas / restai

RestAI is an open-source AIaaS (AI as a Service) platform. Built on top of LlamaIndex, Ollama, and HF Pipelines, it supports any public LLM supported by LlamaIndex and any local LLM supported by Ollama, with precise embeddings usage and tuning.

Home Page: https://apocas.github.io/restai/

Alternative way to import OpenAI?

rjtormis opened this issue · comments

I have a deployed OpenAI service in Azure and I want to use it. Is there any way to configure it besides using OPENAI_API_KEY? Azure provides these variables:

AZURE_OPENAI_API_ENDPOINT=
AZURE_OPENAI_API_KEY=
AZURE_OPENAI_API_VERSION=
AZURE_OPENAI_API_INSTANCE_NAME=
AZURE_OPENAI_API_DEPLOYMENT_NAME=
AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME=

That's the format they use. Thanks!

Yes, you can pass any constructor parameters: edit an LLM in the frontend and set its options field to a JSON object containing all the options accepted by the class you selected.

Example:
{"model": "gpt-35-turbo-16k", "deployment_name": "my-custom-llm", "api_key": "xxxxxxxxx", "azure_endpoint": "https://<your-resource-name>.openai.azure.com/", "api_version": "2023-07-01-preview"}
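To connect the question's Azure environment variables to that options JSON, here is a minimal sketch of how you might translate one into the other. The `ENV_TO_OPTION` mapping and `build_options` helper are hypothetical illustrations (not part of RestAI); the env-var names come from the question and the option keys from the example above.

```python
import json

# Hypothetical mapping from the Azure-style environment variables in the
# question to constructor options like those in the example above.
ENV_TO_OPTION = {
    "AZURE_OPENAI_API_KEY": "api_key",
    "AZURE_OPENAI_API_ENDPOINT": "azure_endpoint",
    "AZURE_OPENAI_API_VERSION": "api_version",
    "AZURE_OPENAI_API_DEPLOYMENT_NAME": "deployment_name",
}

def build_options(env: dict) -> str:
    """Return a JSON string to paste into the LLM's options field."""
    options = {opt: env[var] for var, opt in ENV_TO_OPTION.items() if var in env}
    return json.dumps(options)

# Example with placeholder values:
env = {
    "AZURE_OPENAI_API_KEY": "xxxxxxxxx",
    "AZURE_OPENAI_API_ENDPOINT": "https://my-resource.openai.azure.com/",
    "AZURE_OPENAI_API_VERSION": "2023-07-01-preview",
    "AZURE_OPENAI_API_DEPLOYMENT_NAME": "my-custom-llm",
}
print(build_options(env))
```

Variables not listed in the mapping (e.g. the instance name or embeddings deployment) would need to be matched against whatever parameters the selected class actually accepts.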

Additionally, I added LlamaIndex's AzureOpenAI class to the supported LLMs; it allows a tighter integration with Azure:
e853a34#diff-08524b3ea22905074b8c7494a1dfc7d8f430f76f38b14d443d3eb0ae99093348R40