Sinaptik-AI / pandas-ai

Chat with your database (SQL, CSV, pandas, polars, MongoDB, NoSQL, etc.). PandasAI makes data analysis conversational using LLMs (GPT 3.5 / 4, Anthropic, VertexAI) and RAG.

Home Page: https://pandas-ai.com

JSON error on any prompt passed: Expecting value: line 1 column 1 (char 0). Set up using Ollama (llama3) locally with LangChain.

hsbsid opened this issue · comments

System Info

OS version: Windows
Python version: 3.12.3
The current version of pandasai being used: 2.0.36

πŸ› Describe the bug

I am trying to use PandasAI with Ollama (llama3) locally through LangChain. Testing the package with a basic prompt results in a JSON error.

Code to reproduce:

import pandas as pd 
from langchain_community.llms import Ollama
from pandasai import SmartDataframe

llm = Ollama(model='llama3', base_url='http://127.0.0.1:11434')

df = pd.read_csv('data.csv', encoding='latin-1')

sdf = SmartDataframe(df, config={"llm": llm, "verbose": True})

sdf.chat("How many rows?") 
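
To rule out PandasAI itself, the same LLM object can be exercised directly through LangChain. This is only a diagnostic sketch reusing the model name and base URL from the snippet above:

from langchain_community.llms import Ollama

# Same configuration as the reproduction code; not part of the failing run.
llm = Ollama(model='llama3', base_url='http://127.0.0.1:11434')

# Should print a short completion if the Ollama server is reachable and the model is pulled.
print(llm.invoke("Say hello in one short sentence"))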

Running the reproduction snippet above fails with json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0). Full traceback:

  File "e:\miniforge3\envs\genai\Lib\site-packages\pandasai\pipelines\chat\generate_chat_pipeline.py", line 307, in run
    output = (self.code_generation_pipeline | self.code_execution_pipeline).run(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\miniforge3\envs\genai\Lib\site-packages\pandasai\pipelines\pipeline.py", line 137, in run
    raise e
  File "e:\miniforge3\envs\genai\Lib\site-packages\pandasai\pipelines\pipeline.py", line 101, in run
    step_output = logic.execute(
                  ^^^^^^^^^^^^^^
  File "e:\miniforge3\envs\genai\Lib\site-packages\pandasai\pipelines\chat\code_generator.py", line 33, in execute
    code = pipeline_context.config.llm.generate_code(input, pipeline_context)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\miniforge3\envs\genai\Lib\site-packages\pandasai\llm\base.py", line 200, in generate_code
    response = self.call(instruction, context)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\miniforge3\envs\genai\Lib\site-packages\pandasai\llm\langchain.py", line 55, in call
    res = self.langchain_llm.invoke(prompt)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\miniforge3\envs\genai\Lib\site-packages\langchain_core\language_models\llms.py", line 276, in invoke
    self.generate_prompt(
  File "e:\miniforge3\envs\genai\Lib\site-packages\langchain_core\language_models\llms.py", line 633, in generate_prompt
    return self.generate(prompt_strings, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\miniforge3\envs\genai\Lib\site-packages\langchain_core\language_models\llms.py", line 803, in generate
    output = self._generate_helper(
             ^^^^^^^^^^^^^^^^^^^^^^
  File "e:\miniforge3\envs\genai\Lib\site-packages\langchain_core\language_models\llms.py", line 670, in _generate_helper
    raise e
  File "e:\miniforge3\envs\genai\Lib\site-packages\langchain_core\language_models\llms.py", line 657, in _generate_helper
    self._generate(
  File "e:\miniforge3\envs\genai\Lib\site-packages\langchain_community\llms\ollama.py", line 417, in _generate
    final_chunk = super()._stream_with_aggregation(
                  ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\miniforge3\envs\genai\Lib\site-packages\langchain_community\llms\ollama.py", line 328, in _stream_with_aggregation
    chunk = _stream_response_to_generation_chunk(stream_resp)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\miniforge3\envs\genai\Lib\site-packages\langchain_community\llms\ollama.py", line 20, in _stream_response_to_generation_chunk
    parsed_response = json.loads(stream_response)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\miniforge3\envs\genai\Lib\json\__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\miniforge3\envs\genai\Lib\json\decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "e:\miniforge3\envs\genai\Lib\json\decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
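
The exception is raised inside langchain_community's Ollama client while it parses a line of the streamed response as JSON, so the server apparently returned an empty or non-JSON body. A quick way to see what the endpoint actually returns is to call it directly; this sketch assumes the default Ollama REST endpoint at /api/generate and the model name from the reproduction code:

import requests

# Non-streaming request so the whole reply comes back as a single JSON object.
resp = requests.post(
    "http://127.0.0.1:11434/api/generate",
    json={"model": "llama3", "prompt": "Say hello", "stream": False},
)
print(resp.status_code)
print(resp.text[:200])  # expected: a JSON object containing a "response" field

If this prints an empty body or an HTML/plain-text error page, the failure is on the Ollama side (wrong base_url, model not pulled, or a proxy in the way) rather than in PandasAI.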