
tg-llm-wrapper

A Telegram bot wrapper for the Ollama or OpenAI API.

Just a toy project to play with language models and experiment with prompts.

Installation

Prerequisites:

Also required: either Ollama for self-hosted language models, or an OpenAI API token.
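
If you go with Ollama, you can pull the wrapper's default model ahead of time (openhermes, matching the -ollama-model default listed under Configuration); a stock Ollama install listens on http://localhost:11434, which is also the default -ollama-url:

    ollama pull openhermes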

For development:

Ollama:

make run/local

OpenAI:

make run/openai

Push Docker image (via Ko)

make deploy

Run in Podman

make podman/start

Configuration

See .env.dist for basic configuration. For advanced configuration, run make run/local --help. Alternatively, the following CLI flags are available:

  -llm-engine string
        LLM engine to use (default "openai")
  -ollama-model string
        OLLama model to use. Choose here https://ollama.ai/library (default "openhermes")
  -ollama-url string
        OLLama url (default "http://localhost:11434")
  -openai-api-key string
        OpenAI API key
  -openai-debug
        Debug mode for OpenAI
  -openai-model string
        OpenAI model to use (default "gpt-4-1106-preview")
  -system-prompt string
        custom initial prompt
  -telegram-bot-token string
        Telegram bot token
  -telegram-debug
        Debug mode for Telegram
  -telegram-user-id int
        Telegram user id
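
As an example (not taken from the project's docs), a self-hosted setup could be started directly from the built binary; the binary name tg-llm-wrapper, the engine value ollama, and the placeholder token and user id are all assumptions here:

    tg-llm-wrapper \
      -llm-engine ollama \
      -ollama-model openhermes \
      -ollama-url http://localhost:11434 \
      -telegram-bot-token "$TELEGRAM_BOT_TOKEN" \
      -telegram-user-id 123456789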

Usage

After setup, initiate a conversation with your Telegram bot to interact with the LLM wrapper.
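
For a sense of what happens behind that conversation, the sketch below shows the general shape of such a wrapper. It is not this repository's code: it assumes the go-telegram-bot-api and go-openai libraries, covers only the OpenAI path, and reuses the flag names documented above.

    package main

    import (
        "context"
        "flag"
        "log"

        tgbotapi "github.com/go-telegram-bot-api/telegram-bot-api/v5"
        openai "github.com/sashabaranov/go-openai"
    )

    func main() {
        // Flag names mirror the ones documented in the Configuration section.
        botToken := flag.String("telegram-bot-token", "", "Telegram bot token")
        apiKey := flag.String("openai-api-key", "", "OpenAI API key")
        model := flag.String("openai-model", "gpt-4-1106-preview", "OpenAI model to use")
        userID := flag.Int64("telegram-user-id", 0, "Telegram user id")
        flag.Parse()

        bot, err := tgbotapi.NewBotAPI(*botToken)
        if err != nil {
            log.Fatal(err)
        }
        llm := openai.NewClient(*apiKey)

        updates := tgbotapi.NewUpdate(0)
        updates.Timeout = 30
        for update := range bot.GetUpdatesChan(updates) {
            // Ignore non-message updates and anyone but the configured user.
            if update.Message == nil || update.Message.From.ID != *userID {
                continue
            }

            // Forward the incoming text to the LLM backend.
            resp, err := llm.CreateChatCompletion(context.Background(), openai.ChatCompletionRequest{
                Model: *model,
                Messages: []openai.ChatCompletionMessage{
                    {Role: openai.ChatMessageRoleUser, Content: update.Message.Text},
                },
            })
            if err != nil {
                log.Printf("completion failed: %v", err)
                continue
            }

            // Send the model's reply back to the same chat.
            reply := tgbotapi.NewMessage(update.Message.Chat.ID, resp.Choices[0].Message.Content)
            if _, err := bot.Send(reply); err != nil {
                log.Printf("send failed: %v", err)
            }
        }
    }

The actual wrapper also supports an Ollama backend (selected with -llm-engine) and a custom -system-prompt, which this sketch omits.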

Acknowledgments

Kudos to the following projects:
