Ollama integration not working and Groq not responding
Himanshu-369 opened this issue
Tried the previous solution:
pip uninstall openui
pip install .
The issue is still the same; now even the Ollama integration is not working.
And GROQ_BASE_URL / GROQ_API_KEY make no sense, because setting them changes nothing.
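For context, a typical way to set them before launching the server (a sketch: the base URL below is Groq's OpenAI-compatible endpoint, and the launch command is an assumption about the setup):
# hedged example: export the Groq variables in the shell, then start OpenUI
export GROQ_BASE_URL=https://api.groq.com/openai/v1
export GROQ_API_KEY=gsk_your_key_here   # placeholder key
python -m openui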
You need to pull some model to make Ollama appear.
ollama pull llama2
after pulling rerun the server
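You can confirm Ollama actually sees your models before restarting; both commands below are standard Ollama tooling, and 11434 is Ollama's default port:
ollama list                            # prints the locally pulled models
curl http://localhost:11434/api/tags   # the same list via Ollama's HTTP API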
Bro, are you kidding? I have models installed; someone would be an idiot to use Ollama without downloading models.
It's not Discord or Facebook; only an idiot would be kidding around on GitHub.
Which OS are you working on?
Please send a snapshot of both Ollama running and the models dropdown in OpenUI.
@Himanshu-369 it looks like you don't have the most recent changes. The UI was making a request to /v1/ollama/tags, which has been replaced with /v1/models on the main branch. I just pushed some new changes to main and upgraded the version. Try pulling again and re-installing. If you check the Network tab in your Chrome inspector window you'll find a request to /v1/models, and the response should contain a list of the Ollama models that are running.
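As a quick check outside the browser (assuming OpenUI's default port of 7878; adjust if you run the server elsewhere):
curl http://localhost:7878/v1/models   # should return JSON listing the Ollama models OpenUI can see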
P.S. I agree with @SFARPak. The tone in the previous message rubbed me the wrong way. We're all trying to help here, let's tone it down 🙏
I want to apologize for the tone in my previous message. It was not my intention to come across as harsh or disrespectful to anyone @SFARPak @vanpelt . I appreciate all the efforts everyone is putting in to help and collaborate.
No worries. Did you get to solve the issue?
I was asking about your OS because on Windows the environment variables don't work correctly, so you will need to install another package called python-decouple.
python -m pip install python-decouple
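Once it's installed, one way to supply the keys is a .env file in the directory you launch OpenUI from (a sketch, assuming the server reads the file via python-decouple; the base URL is Groq's OpenAI-compatible endpoint):
echo GROQ_API_KEY=gsk_your_key_here > .env
echo GROQ_BASE_URL=https://api.groq.com/openai/v1 >> .env
python -m openui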