django / channels

Developer-friendly asynchrony for Django

Home page: https://channels.readthedocs.io

Small messages are getting concatenated

PositivPy opened this issue · comments

The following code sends all of the small message chunks in one go. I want them sent separately.

consumers.py

import openai
from channels.generic.websocket import AsyncWebsocketConsumer

class AIConsumer(AsyncWebsocketConsumer):
    async def connect(self):
        print('Connected')
        await self.accept()

    async def disconnect(self, close_code):
        pass

    async def receive(self, text_data=None, bytes_data=None):
        # Set your OpenAI API key and custom base URL
        openai.api_key = "YOUR_OPENAI_API_KEY"
        custom_base_url = "http://70672e1fb64c2dc61c3f4821d103339f.serveo.net/v1"
        openai.api_base = custom_base_url

        # Example prompt to send to your self-hosted model
        prompt = text_data  # Received prompt from WebSocket

        # Make a streaming request using the custom URL
        response_stream = openai.Completion.create(
            model="text-davinci-003",  # Choose the engine you've hosted
            prompt=prompt,
            max_tokens=50,  # Adjust the token length of each response
            stream=True  # Enable streaming
        )

        # Stream and handle the responses
        for response in response_stream:
            answer = response.choices[0].text.strip()
            await self.send(text_data=answer)
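A side note on the consumer above: `openai.Completion.create` is a synchronous call, so invoking it directly inside an async `receive` blocks the event loop while the stream is consumed. A minimal sketch of offloading a blocking call with the standard-library `asyncio.to_thread` (the `blocking_completion` function here is a hypothetical stand-in for the API call, not the real OpenAI client):

```python
import asyncio

def blocking_completion(prompt):
    # Hypothetical stand-in for a synchronous, blocking API call
    # such as openai.Completion.create.
    return f"echo: {prompt}"

async def handle(prompt):
    # asyncio.to_thread (Python 3.9+) runs the blocking function in a
    # worker thread, so other coroutines keep running meanwhile.
    result = await asyncio.to_thread(blocking_completion, prompt)
    return result

print(asyncio.run(handle("hi")))  # prints "echo: hi"
```

This does not by itself change how chunks are framed on the wire, but it keeps a single slow completion from stalling every other connection served by the same event loop.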

Assuming it works with curl when you pass the --no-buffer flag, your web server is likely buffering the response. You'll need to look into that (it's not something I can particularly help with here).
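To illustrate the diagnosis above: a sketch of checking for buffering with curl, and the nginx directives that commonly cause it. The endpoint URL is a placeholder, and the nginx directives only apply if nginx actually sits in front of your app server:

```shell
# With --no-buffer, curl prints each chunk as it arrives rather than
# waiting for the whole response. If chunks appear one by one here but
# arrive concatenated in the browser, a proxy in between is buffering.
curl --no-buffer http://localhost:8000/your-streaming-endpoint

# If nginx is the proxy, response buffering is on by default; disabling
# it in the relevant location block looks like:
#
#   location /your-streaming-endpoint {
#       proxy_buffering off;
#       proxy_pass http://app_upstream;
#   }
```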