Allow configuring LLM API provider details in the UI instead of (or in addition to) .toml or .env
arunkumarakvr opened this issue · comments
Describe the bug
There is currently no way to enter LLM API provider configuration details directly in the UI (e.g. under Settings); configuration only works through .toml or .env files, and even then it does not work reliably. Please support entering these details in the UI, instead of or in addition to the config files, and make it work seamlessly.
Please provide more context
Anyone who does not have access to the OpenAI API and cannot serve Ollama is unable to deploy this wrapper locally and use it properly. That is where the wrapper becomes useless, since none of the LLM APIs are actually usable; even when trying the existing options, nothing works yet. If there is an alternative solution, please share it, or fix the issue.
In conclusion, the wrapper is not working properly: neither the OpenAI API nor Ollama is usable, including the popular OpenAI-compatible APIs. Is this due to an incorrect configuration, or something else?
Please keep this to a single issue (#107); we can continue the discussion there.