CYQIQ / MultiCoT

Repository demonstrating Chain of Table reasoning with multiple tables, powered by LangGraph


Ollama implementation error caused by FunctionMessage

qqgeogor opened this issue

When I switch from the OpenAI model to a ChatOllama model such as llama3, the FunctionMessage triggers a message-conversion error, since ChatOllama only supports the system, user, and assistant roles.
Is there an easy way to fix this in the code? One workaround I'm considering is sketched below.
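A minimal sketch of that workaround, assuming langchain_core's message classes (`to_ollama_messages` is a hypothetical helper, not from this repo): rewrite every FunctionMessage into a user-role message before it reaches ChatOllama.

```python
from langchain_core.messages import BaseMessage, FunctionMessage, HumanMessage

def to_ollama_messages(messages: list[BaseMessage]) -> list[BaseMessage]:
    """Rewrite FunctionMessage entries into user-role messages, since
    ChatOllama only accepts the system/user/assistant roles."""
    converted: list[BaseMessage] = []
    for m in messages:
        if isinstance(m, FunctionMessage):
            # Fold the function result into a HumanMessage so the model
            # still sees the tool output as ordinary conversation context.
            converted.append(
                HumanMessage(content=f"Result of `{m.name}`:\n{m.content}")
            )
        else:
            converted.append(m)
    return converted
```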

Also, after switching the model to llama3, even the first step of inspecting the dataframe via .head() fails: the model does not generate a response that calls the function. I attempted this with OllamaFunctions().bind_tools().
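For context, this is roughly what I tried. It is a sketch under assumptions: the `df_head` tool name and its schema are hypothetical, and the exact `bind_tools` behavior varies across langchain_experimental versions.

```python
from langchain_experimental.llms.ollama_functions import OllamaFunctions

# format="json" asks Ollama to emit JSON, which function calling relies on.
llm = OllamaFunctions(model="llama3", format="json")

# Bind a single illustrative tool; "df_head" and its schema are hypothetical.
llm_with_tools = llm.bind_tools(tools=[
    {
        "name": "df_head",
        "description": "Return the first n rows of the dataframe.",
        "parameters": {
            "type": "object",
            "properties": {
                "n": {"type": "integer", "description": "number of rows to show"},
            },
            "required": ["n"],
        },
    }
])

response = llm_with_tools.invoke("Show the first 5 rows of the table.")
print(response)
```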