langflow-ai / langflow

⛓️ Langflow is a visual framework for building multi-agent and RAG applications. It's open-source, Python-powered, fully customizable, model and vector store agnostic.

Home Page: http://www.langflow.org


Ollama Error

VinojRaj opened this issue

Discussed in #1804

Originally posted by VinojRaj April 30, 2024
I am new to Langflow and was trying to use Llama 2 through Ollama as the model, but I am getting the following error:
ValueError: Error building vertex Ollama: ChatOllamaComponent.build() missing 1 required positional argument: 'base_url'

The base URL is left at the default, http://localhost:11434/.

I have the same ...

It seems to be working now :)

Hi, I'm no expert in Langflow, but I had a similar issue. Can you describe your Langflow deployment? Are you running it in Docker or directly on your local machine? Have you tried making POST requests to Ollama with Postman?
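If you don't have Postman handy, a quick curl against Ollama's model-listing endpoint works too. This sketch assumes the default port 11434 and Ollama's standard `/api/tags` route:

```shell
# Ask Ollama for the HTTP status of its model-list endpoint.
# "200" means the server is reachable; "000" means no connection at all.
STATUS=$(curl -s -o /dev/null -w '%{http_code}' http://localhost:11434/api/tags) || STATUS=000
echo "Ollama HTTP status: $STATUS"
```

A "000" here usually points at a networking problem (wrong host, closed port, or containers that can't see each other) rather than a Langflow bug.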

In my case it was a network-related error: I was running Langflow and Ollama in separate Docker containers. Nothing that creating a Docker network and attaching both containers to it couldn't fix. But I would need more details to tell what could be happening on your side (screenshots of the error would be much appreciated).

I suspect an update changed something, or it's some other issue I'm not familiar with. I have Ollama running on http://127.0.0.1:11434, which is typical, but yesterday there was an error and by midnight it had changed ;). One remark: I am not running the app in a container...

I am running Langflow locally on my computer and I have the same problem.
What exactly is the base_url?

Same problem here, trying to run a RAG flow using Ollama.

My friends,
here is what helped for me:
put both containers (Langflow and Ollama) on the same network.

  1. Create a new network "my-net":
     `docker network create my-net`
  2. Find the container names (last column; here `ollama` and `docker_example-langflow-1`):
     `docker ps`
  3. Add the first container:
     `docker network connect my-net ollama`
  4. Add the second container:
     `docker network connect my-net docker_example-langflow-1`
  5. Use the container name in the base_url, e.g. `http://ollama:11434` for Ollama.

Hope that helps.
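For reference, the steps above collected into one script. The container names `ollama` and `docker_example-langflow-1` are taken from this thread; yours may differ, so check `docker ps` first:

```shell
# Create a shared network and attach both containers to it.
# Container names below are assumptions from this thread; adjust to your setup.
NET=my-net
docker network create "$NET"
docker network connect "$NET" ollama
docker network connect "$NET" docker_example-langflow-1

# On a shared Docker network, containers resolve each other by name,
# so the base_url in Langflow's Ollama component becomes:
BASE_URL=http://ollama:11434
echo "base_url: $BASE_URL"
```

Note that `localhost` inside the Langflow container refers to the container itself, not the host, which is why the container name has to be used instead.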

If you are running Ollama from the taskbar app, quit it, then start the server from a terminal with `ollama serve`. It will work after that.