Sinaptik-AI / pandas-ai

Chat with your database (SQL, CSV, pandas, polars, mongodb, noSQL, etc). PandasAI makes data analysis conversational using LLMs (GPT 3.5 / 4, Anthropic, VertexAI) and RAG.

Home Page: https://pandas-ai.com

Allow Custom API Base URL

TebooNok opened this issue · comments

🚀 The feature

The pandasai.OpenAI class does not support a custom API base URL.

Motivation, pitch

I deploy my own API via one-api to relay requests to OpenAI, so I have to modify the code in pandas-ai to use my custom endpoint. With the plain openai client it looks like this:

from openai import OpenAI
client = OpenAI(
    api_key="<my custom one-api key>",
    base_url="<my custom one-api endpoint>",
)

Alternatives

Allow init args passed to pandasai.OpenAI() to be forwarded to openai.OpenAI, e.g.:

llm = pandasai.OpenAI(api_base=<my custom one-api endpoint>)
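
For illustration, a minimal sketch of what the pass-through amounts to on the openai side; make_openai_client is a hypothetical helper, not part of either library, and note that the openai>=1.0 client takes the endpoint as base_url:

import openai

def make_openai_client(api_key: str, api_base: str | None = None) -> openai.OpenAI:
    # Forward the custom endpoint to the v1 client; openai>=1.0 calls this
    # keyword base_url (older versions used the module-level openai.api_base).
    params = {"api_key": api_key}
    if api_base:
        params["base_url"] = api_base
    return openai.OpenAI(**params)

client = make_openai_client(
    "<my custom one-api key>", "<my custom one-api endpoint>"
)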

Additional context

No response

This is how I added my API endpoint in pandasai/llm/openai.py:

        # set the client
        model_name = self.model.split(":")[1] if "ft:" in self.model else self.model
        if model_name in self._supported_chat_models:
            self._is_chat_model = True
            self.client = (
                openai.OpenAI(**self._client_params, base_url="<my custom api endpoint>").chat.completions
                if is_openai_v1()
                else openai.ChatCompletion
            )
        elif model_name in self._supported_completion_models:
            self._is_chat_model = False
            self.client = (
                openai.OpenAI(**self._client_params, base_url="<my custom api endpoint>").completions
                if is_openai_v1()
                else openai.Completion
            )
        else:
            raise UnsupportedModelError(self.model)
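
Note that the pre-1.0 fallbacks (openai.ChatCompletion / openai.Completion) are not covered by this patch; with openai<1.0 the equivalent is the module-level setting (a sketch, assuming the legacy client):

import openai

# openai<1.0 reads the key and endpoint from module-level globals
# rather than from a client instance.
openai.api_key = "<my custom one-api key>"
openai.api_base = "<my custom one-api endpoint>"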

It's OK now; this is fixed in 2.0.26.
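
With the fixed release, the endpoint should be configurable directly on the wrapper. A minimal usage sketch, assuming the constructor keyword is named api_base (verify the exact name against the 2.0.26 docs/changelog):

from pandasai.llm import OpenAI

# api_token is pandasai's standard name for the OpenAI key;
# api_base is assumed to be the keyword exposed by the fix.
llm = OpenAI(
    api_token="<my custom one-api key>",
    api_base="<my custom one-api endpoint>",
)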