ruecat / ollama-telegram

🦙 Ollama Telegram bot, with advanced configuration

llama3

gabyavra opened this issue · comments

Trying to use it with llama3, but it does not reply to my messages.

[Screenshot 2024-04-21 at 18:13:27]

Any idea how to debug this?

Also, when I try to run it from Docker, I get an error (although there is an .env file with the Telegram API key):


Thank you.

It eventually worked with llama3, but I had left llama-2 in the .env file.
Maybe a try/except would be useful here:

[Three screenshots of the error messages: IMG_4250, IMG_4251, IMG_4252]

The error you're getting in the message is already the result of a try/except:

ollama-telegram/bot/run.py

Lines 256 to 261 in 279fac5

except Exception as e:
    await bot.send_message(
        chat_id=message.chat.id,
        text=f"""Error occurred\n```\n{traceback.format_exc()}\n```""",
        parse_mode=ParseMode.MARKDOWN_V2,
    )

The model response is streamed to the user, and the chunks are assembled into the message response -- you likely received a complete response in addition to the error message, which was raised on a single chunk.
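The chunk-assembly behavior described above can be sketched roughly as follows. This is a hypothetical illustration, not the repo's actual code; the names `stream_chunks` and `assemble_response` are invented:

```python
def assemble_response(stream_chunks):
    """Concatenate streamed chunks into one message response.

    An exception raised while processing any single chunk aborts the
    loop, so the user may still see the text assembled so far, plus
    the separately sent error message.
    """
    full_response = ""
    for chunk in stream_chunks:
        full_response += chunk
    return full_response


print(assemble_response(["Hello", ", ", "world"]))
```

In other words, one bad chunk triggers the except handler, but everything streamed before it has already reached the chat.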

If you don't want to see error messages in the Telegram chat, you should be able to replace lines 257-261 with a log or print statement (anything other than a message) and restart your container.
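A minimal sketch of that change, assuming you swap the `bot.send_message(...)` call for standard-library logging (the `raise` below is just a stand-in for the real streaming call):

```python
import logging
import traceback

logging.basicConfig(level=logging.ERROR)

try:
    raise RuntimeError("model failed on a chunk")  # stand-in for the real call
except Exception:
    # Instead of bot.send_message(...), log the traceback server-side only:
    logging.error("Error occurred:\n%s", traceback.format_exc())
```

The error then appears in the container logs (`docker logs <container>`) rather than in the chat.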

Next step for this would be RAG support via Telegram :)