langchain-ai / langchain

🦜🔗 Build context-aware reasoning applications

Home Page: https://python.langchain.com


OllamaFunctions returning TypeError when using with_structured_output

SatouKuzuma1 opened this issue · comments

Checked other resources

  • I added a very descriptive title to this issue.
  • I searched the LangChain documentation with the integrated search.
  • I used the GitHub search to find a similar question and didn't find it.
  • I am sure that this is a bug in LangChain rather than my code.
  • The bug is not resolved by updating to the latest stable version of LangChain (or the specific integration package).

Example Code

I'm using this simple code and it's returning an error.
I checked the documentation, and the example there is similar to what I'm trying to do.

from typing import List

# Imports implied by the snippet (the BaseModel/Field source is assumed;
# langchain_core.pydantic_v1 was the convention for this langchain version).
from langchain_core.pydantic_v1 import BaseModel, Field
from langchain_experimental.llms.ollama_functions import OllamaFunctions

class RelatedSubjects(BaseModel):
    topics: List[str] = Field(
        description="Comprehensive list of related subjects as background research.",
    )

ollama_functions_llm = OllamaFunctions(model="llama3", format="json")
expand_chain = gen_related_topics_prompt | ollama_functions_llm.with_structured_output(
    RelatedSubjects
)

related_subjects = await expand_chain.ainvoke({"topic": example_topic})
related_subjects

Error Message and Stack Trace (if applicable)

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[537], line 1
----> 1 related_subjects = await expand_chain.ainvoke({"topic": example_topic})
      2 related_subjects

File ~/Desktop/python/lang-last/.venv/lib/python3.11/site-packages/langchain_core/runnables/base.py:2536, in RunnableSequence.ainvoke(self, input, config, **kwargs)
   2534 try:
   2535     for i, step in enumerate(self.steps):
-> 2536         input = await step.ainvoke(
   2537             input,
   2538             # mark each step as a child run
   2539             patch_config(
   2540                 config, callbacks=run_manager.get_child(f"seq:step:{i+1}")
   2541             ),
   2542         )
   2543 # finish the root run
   2544 except BaseException as e:

File ~/Desktop/python/lang-last/.venv/lib/python3.11/site-packages/langchain_core/runnables/base.py:4537, in RunnableBindingBase.ainvoke(self, input, config, **kwargs)
   4531 async def ainvoke(
   4532     self,
   4533     input: Input,
   4534     config: Optional[RunnableConfig] = None,
   4535     **kwargs: Optional[Any],
...
    179     """
--> 180     raise TypeError(f'Object of type {o.__class__.__name__} '
    181                     f'is not JSON serializable')

TypeError: Object of type ModelMetaclass is not JSON serializable

Description

I'm trying to run an example from langgraph with a local Ollama model. The only way I've found to get structured output is OllamaFunctions, but it throws an error.
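To narrow it down: the same TypeError comes straight from the standard library whenever `json.dumps` is handed a class object instead of an instance or a schema dict, which suggests the chain is passing the Pydantic class itself into the request payload. A minimal stdlib-only repro (a plain class stands in for the Pydantic model, which is why the message below says `type` rather than `ModelMetaclass`):

```python
import json

class RelatedSubjects:  # stand-in for the Pydantic model; a plain class fails the same way
    pass

try:
    json.dumps(RelatedSubjects)  # serializing the class itself, not an instance or schema dict
    error_message = None
except TypeError as exc:
    error_message = str(exc)

print(error_message)  # Object of type type is not JSON serializable
```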

System Info

langchain==0.1.17
langchain-community==0.0.37
langchain-core==0.1.52
langchain-experimental==0.0.58
langchain-groq==0.1.3
langchain-openai==0.1.6
langchain-text-splitters==0.0.1
langchainhub==0.1.15

platform: macOS

Same here.
I tried to run the code from this notebook and got the same serialization problem: https://github.com/langchain-ai/langgraph/blob/main/examples/reflexion/reflexion.ipynb?ref=blog.langchain.dev


---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[37], line 2
      1 example_question = "Why is reflection useful in AI?"
----> 2 initial = first_responder.respond([HumanMessage(content=example_question)])

Cell In[34], line 30, in ResponderWithRetries.respond(self, state)
     28 response = []
     29 for attempt in range(3):
---> 30     response = self.runnable.invoke(
     31         {"messages": state}, {"tags": [f"attempt:{attempt}"]}
     32     )
     33     try:
     34         self.validator.invoke(response)


...

File /opt/conda/lib/python3.10/json/encoder.py:257, in JSONEncoder.iterencode(self, o, _one_shot)
    252 else:
    253     _iterencode = _make_iterencode(
    254         markers, self.default, _encoder, self.indent, floatstr,
    255         self.key_separator, self.item_separator, self.sort_keys,
    256         self.skipkeys, _one_shot)
--> 257 return _iterencode(o, 0)

TypeError: Object of type ModelMetaclass is not JSON serializable

Packages:
langchain 0.1.20
langchain-anthropic 0.1.11
langchain-core 0.1.52
pydantic 2.7.1
pydantic_core 2.18.2

macOS as well.
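Until this is fixed, one possible workaround is to pass a plain, JSON-serializable function schema instead of the Pydantic class. This is an untested sketch: the dict below is hand-written to mirror the RelatedSubjects model from the original snippet, and the commented `bind(functions=...)` usage follows the OllamaFunctions examples rather than a confirmed fix.

```python
import json

# Hand-written function schema mirroring the RelatedSubjects model,
# so only plain JSON-serializable types reach the request payload.
related_subjects_schema = {
    "name": "RelatedSubjects",
    "description": "Comprehensive list of related subjects as background research.",
    "parameters": {
        "type": "object",
        "properties": {
            "topics": {
                "type": "array",
                "items": {"type": "string"},
                "description": "Comprehensive list of related subjects as background research.",
            },
        },
        "required": ["topics"],
    },
}

# Unlike the model class, this dict serializes cleanly:
payload = json.dumps(related_subjects_schema)

# Hypothetical usage (names as in the original snippet):
# llm = OllamaFunctions(model="llama3", format="json")
# bound = llm.bind(functions=[related_subjects_schema],
#                  function_call={"name": "RelatedSubjects"})
```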