ChatGPT-Telegram-Bot

A Telegram bot that uses ChatGPT to generate answers - written (mostly) by ChatGPT

The generated code

import telegram
import openai
import os

# Set up the Telegram bot and OpenAI API
bot = telegram.Bot(token=os.environ["TELEGRAM_BOT_TOKEN"])
openai.api_key = os.environ["OPENAI_API_KEY"]

# Define the function that handles incoming messages
def handle_message(update, context):
    # Get the message text and chat ID
    message_text = update.message.text
    chat_id = update.message.chat_id

    # Call the OpenAI API to generate a response
    response = openai.Completion.create(
        engine="davinci",
        prompt=message_text,
        max_tokens=1024,
        n=1,
        stop=None,
        temperature=0.5,
    )

    # Get the generated response from the OpenAI API
    generated_text = response.choices[0].text.strip()

    # Send the response back to the user
    bot.send_message(chat_id=chat_id, text=generated_text)

# Set up the Telegram bot to listen for messages
updater = telegram.ext.Updater(token=os.environ["TELEGRAM_BOT_TOKEN"], use_context=True)
dispatcher = updater.dispatcher
dispatcher.add_handler(telegram.ext.MessageHandler(telegram.ext.Filters.text & (~telegram.ext.Filters.command), handle_message))
updater.start_polling()

Can you write a Dockerfile for the program?

# Use an official Python runtime as a parent image
FROM python:3.9-slim-buster

# Set the working directory to /app
WORKDIR /app

# Copy the requirements file into the container
COPY requirements.txt .

# Install the required Python packages
RUN pip install --no-cache-dir -r requirements.txt

# Copy the Python script into the container
COPY your_script.py .

# Expose the port that the Telegram bot will use
EXPOSE 8443

# Set the environment variables
ENV TELEGRAM_BOT_TOKEN your_bot_token
ENV OPENAI_API_KEY your_openai_api_key

# Run the Python script when the container starts
CMD ["python", "your_script.py"]

Can you give me a docker-compose file for the container?

version: '3'

services:
  telegram-bot:
    image: your_image_name
    environment:
      TELEGRAM_BOT_TOKEN: your_bot_token
      OPENAI_API_KEY: your_openai_api_key
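
As generated, the compose file assumes the image has already been built and tagged your_image_name with the docker build command listed below. An alternative (my addition, not part of the generated output) is to let Compose build it from the local Dockerfile:

version: '3'

services:
  telegram-bot:
    build: .
    environment:
      TELEGRAM_BOT_TOKEN: your_bot_token
      OPENAI_API_KEY: your_openai_api_key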

The generated instructions

It also supplied the commands needed to run the program after each request.

Python

pip install python-telegram-bot openai
export TELEGRAM_BOT_TOKEN=token
export OPENAI_API_KEY=apikey
python your_script.py

Dockerfile

docker build -t your_image_name .
docker run -it --rm -p 8443:8443 -e TELEGRAM_BOT_TOKEN=your_bot_token -e OPENAI_API_KEY=your_openai_api_key your_image_name

docker-compose

docker-compose up

Wait. Did this work?!

No. The Dockerfile and docker-compose file were mostly okay. ChatGPT did not explain how to create the requirements.txt (it did after being asked about it), and because its knowledge ends around 2021, the python-telegram-bot code was simply wrong: the library's API has changed a lot since then. While fixing the python-telegram-bot code I also added a few more features, such as logging, switched to a different model, and made some tweaks to suit my use.
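
For reference, here is a minimal sketch of what the handler looks like against the current python-telegram-bot (v20+) async API and the newer OpenAI client. It illustrates the kind of changes that were needed, not the exact code in this repository, and the model name is just a placeholder:

import os

from openai import AsyncOpenAI
from telegram import Update
from telegram.ext import Application, ContextTypes, MessageHandler, filters

# Placeholder model name; the text above only says the model was changed.
MODEL = "gpt-3.5-turbo"

client = AsyncOpenAI(api_key=os.environ["OPENAI_API_KEY"])


async def handle_message(update: Update, context: ContextTypes.DEFAULT_TYPE) -> None:
    # Handlers are coroutines in python-telegram-bot v20+.
    response = await client.chat.completions.create(
        model=MODEL,
        messages=[{"role": "user", "content": update.message.text}],
    )
    await update.message.reply_text(response.choices[0].message.content)


def main() -> None:
    application = Application.builder().token(os.environ["TELEGRAM_BOT_TOKEN"]).build()
    application.add_handler(MessageHandler(filters.TEXT & ~filters.COMMAND, handle_message))
    # run_polling() blocks and manages the event loop itself.
    application.run_polling()


if __name__ == "__main__":
    main()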

I think it's still very impressive that the generated code, at least in theory, did exactly what was asked for.
