Ollama implementation error caused by FunctionMessage
qqgeogor opened this issue
When I switch from the OpenAI model to a ChatOllama model such as llama3, the FunctionMessage triggers a message-conversion error, since ChatOllama only supports the system, user, and assistant roles.
Is there an easy way to fix this in the code?
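To illustrate the kind of workaround I have in mind: would it be enough to downgrade FunctionMessages into plain HumanMessages before they reach ChatOllama? A rough, untested sketch (the `downgrade_function_messages` helper is something I made up for this example, not an existing API):

```python
from langchain_core.messages import FunctionMessage, HumanMessage

def downgrade_function_messages(messages):
    """Rewrite FunctionMessage entries as plain HumanMessages so that
    ChatOllama (which only knows system/user/assistant roles) accepts them."""
    converted = []
    for m in messages:
        if isinstance(m, FunctionMessage):
            # Carry the tool output forward as ordinary user content.
            converted.append(
                HumanMessage(content=f"[tool {m.name} returned]\n{m.content}")
            )
        else:
            converted.append(m)
    return converted
```

If there is already a supported way to handle this in the library, a pointer would be appreciated.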
Also, after switching the model to llama3, even the first step of inspecting the dataframe with .head() fails: the LLM does not generate a response that calls the function, even when I use OllamaFunctions().bind_tools().
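For reference, this is roughly the shape of what I tried with OllamaFunctions().bind_tools() (a simplified, self-contained sketch; `df_head` and the toy dataframe are placeholders invented for this example, not the real agent tool):

```python
import pandas as pd
from langchain_core.tools import tool
from langchain_experimental.llms.ollama_functions import OllamaFunctions

# Toy dataframe standing in for the real data.
df = pd.DataFrame({"a": [1, 2, 3], "b": [4, 5, 6]})

@tool
def df_head(n: int = 5) -> str:
    """Return the first n rows of the dataframe as text."""
    return df.head(n).to_string()

llm = OllamaFunctions(model="llama3", format="json")
llm_with_tools = llm.bind_tools([df_head])

resp = llm_with_tools.invoke("Show the first 5 rows of the dataframe.")
print(resp)  # with llama3 I get plain text back instead of a tool call
```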