Bavarder / Bavarder

Chit-chat with an AI

Home Page: https://bavarder.codeberg.page

Imported `.bin` models don't generate meaningful answers

tsilvs opened this issue · comments

Describe the bug
Any local model I choose replies with "Hello, I am Bavarder, a Chit-Chat AI" instead of generating a response to my prompt.

To Reproduce
Steps to reproduce the behavior:

  1. In Preferences, enable only the local provider and disable all others
  2. Download a model supported by GPT4All (e.g. `nous-hermes-13b.ggmlv3.q4_0.bin`)
  3. Place or hardlink the `.bin` file in the `~/.var/app/io.github.Bavarder.Bavarder/cache/bavarder/models` directory
  4. Open the Bavarder app
  5. Click the cloud button so it shows a crossed-out cloud (a mouse-hover tooltip on this button would help clarify what it is called and what it does)
  6. Open the dot menu and select your local model (e.g. `nous-hermes-13b.ggmlv3.q4_0.bin`)
  7. Create a new chat
  8. Send a prompt
  9. Receive the answer "Hello, I am Bavarder, a Chit-Chat AI"
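For reference, steps 2–3 can be sketched as a shell session. This is a minimal sketch, assuming the model file was already downloaded to `~/Downloads` (the filename and cache path are the ones from the steps above); hardlinking avoids duplicating the multi-GiB file, with a copy as fallback when the two paths are on different filesystems:

```shell
# Model filename and Bavarder's Flatpak cache directory (from the steps above).
MODEL="nous-hermes-13b.ggmlv3.q4_0.bin"
DEST="$HOME/.var/app/io.github.Bavarder.Bavarder/cache/bavarder/models"

# Make sure the models directory exists.
mkdir -p "$DEST"

# Hardlink the already-downloaded model so no extra disk space is used;
# fall back to a copy if source and destination are on different filesystems
# (hardlinks cannot cross filesystem boundaries).
ln "$HOME/Downloads/$MODEL" "$DEST/$MODEL" 2>/dev/null \
  || cp "$HOME/Downloads/$MODEL" "$DEST/$MODEL" 2>/dev/null

# Confirm the file is where Bavarder looks for local models.
ls -l "$DEST"
```

Note that because Bavarder is a Flatpak, it only scans this sandboxed cache path for models, which is why the file must be placed there rather than in an arbitrary directory.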

Expected behavior
When I send a prompt, I should receive an answer generated by the selected model.

Screenshots

Screenshot: Chat 1
Screenshot: Chat 3

Environment

io.github.Bavarder.Bavarder 1.0.0
Environment: GNOME
Gtk: 4.12.5
Python: 3.11.9
OS: Linux 6.8.10-200.fc39.x86_64 #1 SMP PREEMPT_DYNAMIC Fri May 17 21:20:15 UTC 2024
Providers: ['local']

Additional context

I already have many GiB of LLMs downloaded on my PC. My connection is metered and slow, and my storage is limited as well.

I can't afford to download hundreds of GiB of data again: it would be slow, expensive, and polluting.