Sinaptik-AI / pandas-ai

Chat with your database (SQL, CSV, pandas, Polars, MongoDB, NoSQL, etc.). PandasAI makes data analysis conversational using LLMs (GPT-3.5 / GPT-4, Anthropic, VertexAI) and RAG.

Home Page: https://pandas-ai.com

Unable to pass n_ctx when using Ollama.

Kanishk-Kumar opened this issue · comments

System Info

Title says it all. The call below, on the other hand, raises no error, but it doesn't work either:

from pandasai import SmartDataframe
from pandasai.llm.local_llm import LocalLLM

ollama_llm = LocalLLM(api_base="http://localhost:11434/v1", model="mistral", temperature=0, max_tokens=32768)

But a call like this one raises an error:

from pandasai import SmartDataframe
from pandasai.llm.local_llm import LocalLLM

ollama_llm = LocalLLM(api_base="http://localhost:11434/v1", model="mistral", temperature=0, n_ctx=32768)

Isn't this because it's "somehow" using the OpenAI client to run my local Ollama?:

self.client = OpenAI(base_url=api_base, api_key=api_key).chat.completions
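The line above is the failure mode in miniature: every extra keyword passed to LocalLLM is forwarded verbatim to chat.completions.create(), which raises for any name outside its fixed signature. A minimal standalone mimic (no pandas-ai or openai imports; the function is illustrative, not the real SDK):

```python
# Mimic of why n_ctx fails: the wrapper forwards *all* extra kwargs to a
# method with a fixed keyword signature, just as LocalLLM forwards them
# to OpenAI's chat.completions.create().
def create(*, model, messages, temperature=1.0, max_tokens=None):
    return {"model": model, "messages": messages}

params = {"model": "mistral", "messages": [], "n_ctx": 32768}
try:
    create(**params)
except TypeError as exc:
    # Same kind of TypeError as in the traceback below:
    # unexpected keyword argument 'n_ctx'
    print(exc)
```

max_tokens survives because it is part of the OpenAI chat-completions signature; n_ctx is an Ollama/llama.cpp option the OpenAI-compatible endpoint simply does not define.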

Thanks.

🐛 Describe the bug

Traceback (most recent call last):
  File "/home/user_name/jupyter/venv/lib/python3.10/site-packages/pandasai/pipelines/chat/generate_chat_pipeline.py", line 283, in run
    output = (self.code_generation_pipeline | self.code_execution_pipeline).run(
  File "/home/user_name/jupyter/venv/lib/python3.10/site-packages/pandasai/pipelines/pipeline.py", line 137, in run
    raise e
  File "/home/user_name/jupyter/venv/lib/python3.10/site-packages/pandasai/pipelines/pipeline.py", line 101, in run
    step_output = logic.execute(
  File "/home/user_name/jupyter/venv/lib/python3.10/site-packages/pandasai/pipelines/chat/code_generator.py", line 33, in execute
    code = pipeline_context.config.llm.generate_code(input, pipeline_context)
  File "/home/user_name/jupyter/venv/lib/python3.10/site-packages/pandasai/llm/base.py", line 196, in generate_code
    response = self.call(instruction, context)
  File "/home/user_name/jupyter/venv/lib/python3.10/site-packages/pandasai/llm/local_llm.py", line 45, in call
    return self.chat_completion(self.last_prompt, memory)
  File "/home/user_name/jupyter/venv/lib/python3.10/site-packages/pandasai/llm/local_llm.py", line 36, in chat_completion
    response = self.client.create(**params)
  File "/home/user_name/jupyter/venv/lib/python3.10/site-packages/openai/_utils/_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
TypeError: Completions.create() got an unexpected keyword argument 'n_ctx'

@Kanishk-Kumar I confirm it is using the OpenAI-compatible API for Ollama, so at the moment it's only possible to pass the params that standard supports.
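As a workaround outside pandas-ai, Ollama lets you bake the context window into the model itself with a Modelfile (num_ctx is Ollama's documented name for the context length; the model name mistral-32k below is illustrative):

```
FROM mistral
PARAMETER num_ctx 32768
```

Build it with `ollama create mistral-32k -f Modelfile`, then point LocalLLM at `model="mistral-32k"`; no n_ctx kwarg needs to cross the OpenAI-compatible endpoint.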