danielmiessler / fabric

fabric is an open-source framework for augmenting humans using AI. It provides a modular framework for solving specific problems using a crowdsourced set of AI prompts that can be used anywhere.

Home Page: https://danielmiessler.com/p/fabric-origin-story

[Question]: How to use ollama successfully?

bjcbusiness opened this issue · comments

What is your question?

After reading the documentation, I am still not clear on how to get local llama models working.
I've had to cobble together an understanding from GitHub issues and older documentation. What I came to in the end is this:

My ~/.config/fabric/.env contains:

OPENAI_API_KEY=ollama
OPENAI_BASE_URL=http://127.0.0.1:11434/v1

ollama has only 1 model installed: mistral:instruct

I've tried numerous commands, including --remoteOllamaServer, and they all lead to the same result:

Error: Client error '404 Not Found' for url 'http://127.0.0.1:11434/v1/models'

I've tried different BASE_URLs, such as removing the /v1, but it doesn't help. I've also tried a command that requests a specific pattern and got the same result:

fabric --model mistral --pattern summarize --text "hello my name is ben" --remoteOllamaServer http://localhost:11434/v1
Error: Client error '404 Not Found' for url 'http://127.0.0.1:11434/v1/models'

I appreciate any help you can provide.

Hi @bjcbusiness. I was having issues myself - hopefully I can steer you right.

First things first, make sure you have the most recent version of ollama.

> ollama -v
ollama version is 0.1.32
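The version matters because, as far as I understand it, the OpenAI-compatible /v1 routes were only added to ollama fairly recently, so the /v1/models URL you're hitting may simply not exist on an older build. A quick way to check what your server actually answers (the first endpoint is ollama's native API, the second is the OpenAI-compatible one):

> curl http://127.0.0.1:11434/api/tags    # native API: lists the models you have pulled locally
> curl http://127.0.0.1:11434/v1/models   # OpenAI-compatible route: a 404 here usually means the build predates it

If the second one 404s, upgrading ollama (on Linux, re-running the official install script; on macOS, updating the app) should bring that route in.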

I have no .env configuration in effect - I commented out my OpenAI key.
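For reference, my ~/.config/fabric/.env is effectively empty - roughly this (the key value is just a placeholder):

# OPENAI_API_KEY=sk-xxxx    # commented out; with nothing set, fabric talked to my local ollama instance

I can't say for certain this is the intended configuration, but it's what worked on my machine.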

I also do not specify a remote server - the --help text says you should only do that when not running locally, which you evidently are. The exact wording is: "ONLY USE THIS if you are using a local ollama server in a non-default location or port."
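For completeness, my understanding is that the flag is only for pointing fabric at an ollama instance that isn't at the default 127.0.0.1:11434 - something like this (the host and port below are made up for illustration, and I believe it wants just the server address rather than the /v1 path, though I haven't tested a remote setup myself):

> fabric -m llama3:8b -p summarize --stream --remoteOllamaServer http://192.168.1.50:11434

Since your server is on the default host and port, you shouldn't need it at all.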

This command worked for me:

> pbpaste | ~/.local/bin/fabric -m llama3:8b -p create_summary --stream
# IDENTITY and PURPOSE
As an expert content summarizer, I will condense the provided text into a Markdown-formatted summary.

# OUTPUT SECTIONS

## ONE SENTENCE SUMMARY:
The story concludes with Darrow's son Pax being born to him and Mustang, while the world is rebuilding from the aftermath of war and nuclear destruction.

...
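
One small note in case you're not on macOS: pbpaste is a macOS clipboard utility. Rough equivalents (assuming xclip is installed, and article.txt is just a stand-in file name):

> xclip -selection clipboard -o | ~/.local/bin/fabric -m llama3:8b -p create_summary --stream
> cat article.txt | ~/.local/bin/fabric -m llama3:8b -p create_summary --stream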

The last thing I'll note is that fabric --listmodels does not return a mistral result for me. I'm not sure whether fabric automatically appends :latest to the model name, but you may want to try specifying the full tag to see if that clears things up.
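Something like this is what I'd try (mistral:instruct is the tag you said you have pulled; the exact names should come from the two list commands rather than from me):

> ollama list                 # what ollama itself has pulled, with full tags
> fabric --listmodels         # what fabric can see
> pbpaste | fabric -m mistral:instruct -p summarize --stream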

Wow! Who would have guessed "ignore everything, empty your .env and just refer to the local model."
This did work, and I'm thankful for it.

The documentation could use an update here, though.

Absolutely - I'm glad to have helped! I actually misspoke about what the documentation says regarding the remote ollama config, but nonetheless - glad you're able to make use of it!