aio-libs / aiohttp-sse

Server-sent events support for aiohttp

app timeout after 7-8 requests

arjunpat opened this issue · comments

I am writing a basic pub/sub SSE server with aioredis. Here is my code:

import asyncio
import json
from aiohttp import web
from aiohttp_sse import sse_response
from aiohttp.web import Application, Response
from datetime import datetime
from app import database, datastore


async def subscribe(request):
	channel = request.rel_url.query['channel']
	print('new sub')
	async with sse_response(request) as resp:
		redis_conn = await datastore.Datastore.redis_pool.acquire()

		await redis_conn.execute_pubsub('subscribe', channel)
		ch = redis_conn.pubsub_channels[channel]

		while await ch.wait_message():
			msg = await ch.get(encoding='utf-8')
			await resp.send(msg)

		await redis_conn.execute_pubsub('unsubscribe', channel)

		datastore.Datastore.redis_pool.release(redis_conn)

	return resp

async def publish(request):
	data = await request.post()
	channel = request.match_info['project_id']
	message = data.get('message')

	try:
		await datastore.Datastore.redis_pub.publish(channel, message) # just redis pub/sub stuff
		jsn = json.dumps({
			'success': True
		})
	except Exception as e:
		jsn = json.dumps({
			'success': False
		})

	return Response(text=jsn, content_type='application/json')


loop = asyncio.get_event_loop()

app = Application(loop=loop)
loop.run_until_complete(datastore.Datastore.setup(loop))

app.router.add_route('GET', '/sse/subscribe', subscribe)
app.router.add_route('POST', '/sse/publish/{project_id}', publish)

web.run_app(app, host='127.0.0.1', port=8080)

The thing is, after I subscribe maybe 7 or 8 times to the server (on any channel), any further subscriptions just time out and sit at a constant "pending" status in Google Chrome.
I am running macOS High Sierra, and when I checked Activity Monitor, Python was barely using any CPU (~0.1%) and about 24 MB of RAM on my 16 GB machine. Clearly, this isn't a problem with a lack of resources.

Does anybody have any ideas as to what the problem is? No error messages are produced.

I suspect a problem in how the Redis connections are handled. Have you tried placing a debugger in the handlers?

Yeah. In the subscribe function there is a print statement that says 'new sub'. However, once the connections start timing out after 7-8 requests, it never prints 'new sub', which means the request isn't even hitting the subscribe function. So it is hard for me to believe that this could be a problem with Redis.

You need to identify the actual point where the handler hangs; that print only shows that aiohttp actually started executing the handler.
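
For instance (a rough sketch against the subscribe handler above, not tested), you can bracket each await with a print to see the last point that is reached:

async def subscribe(request):
	print('before sse_response')
	async with sse_response(request) as resp:
		print('before acquire')
		redis_conn = await datastore.Datastore.redis_pool.acquire()
		# if this never prints, acquire() is where the handler hangs
		print('after acquire')
		...
	return resp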

I can only guess where the problem is (since the full source code is not present). I suspect that datastore.Datastore.redis_pool.acquire() never returns because there are no free Redis connections in the pool. In any case, you need to refactor that code to guarantee that the connection is always released; otherwise it leaks connections whenever an error occurs.
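
For example (a sketch of your handler with a try/finally added; untested), something like this guarantees the release even when the client disconnects mid-stream:

async def subscribe(request):
	channel = request.rel_url.query['channel']
	async with sse_response(request) as resp:
		redis_conn = await datastore.Datastore.redis_pool.acquire()
		try:
			await redis_conn.execute_pubsub('subscribe', channel)
			ch = redis_conn.pubsub_channels[channel]
			while await ch.wait_message():
				msg = await ch.get(encoding='utf-8')
				await resp.send(msg)
		finally:
			# the finally block runs even on errors or disconnects,
			# so the connection always goes back to the pool
			await redis_conn.execute_pubsub('unsubscribe', channel)
			datastore.Datastore.redis_pool.release(redis_conn)
	return resp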

Here is the datastore.py file:

import aioredis
import aiomcache
import asyncio


class Datastore:

	@classmethod
	async def setup(cls, loop):
		cls.redis_pool = await aioredis.create_pool('redis://localhost', minsize=35, maxsize=40)
		cls.redis_pub = await aioredis.create_redis('redis://localhost')
		# cls.redissub = await aioredis.create_redis('redis://localhost')

		cls.mc = aiomcache.Client('127.0.0.1', 11211, loop=loop)

I just upped the pool size to 35 connections (from 4) and the problem is still occurring. I don't think Redis is the problem: after about 8 connections, the request doesn't even REACH the subscribe function, let alone the redis_pool.acquire() call. I know this because the print statement (which comes before redis_pool.acquire()) isn't even run.

Plus, after about 8 connections, the other static pages (the pubPage and subPage functions) do not work either; I just get a timeout.

OK, I just tested the following subscribe code, and the problem is still occurring, so the problem is definitely not Redis.

async def subscribe(request):
	print('new sub')
	async with sse_response(request) as resp:
		await asyncio.sleep(1)
		await resp.send('hello')

		await asyncio.sleep(4)
		await resp.send('hello')

		await asyncio.sleep(8)
		await resp.send('hello')

	return resp

After six connections (and exactly six, every single time), all calls to the server (whether SSE or a static file) time out.
Is there some config file that I need to change to allow more than six connections?

Ah, I think I see the problem: are you trying to connect from the same browser? If so, it is the browser's limit on the maximum number of connections per domain (most browsers cap this at six concurrent connections per host, which matches what you are seeing). Try several browsers.
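
If you want to confirm it is the browser and not the server, you could also open the connections from a script instead; here is a rough sketch using aiohttp's client (the channel name and the count of 20 are arbitrary):

import asyncio
import aiohttp


async def open_stream(session, i):
	# hold one SSE connection open and print the first line it receives
	url = 'http://127.0.0.1:8080/sse/subscribe'
	async with session.get(url, params={'channel': 'test'}) as resp:
		print(i, 'connected with status', resp.status)
		async for line in resp.content:
			print(i, line)
			break


async def main():
	async with aiohttp.ClientSession() as session:
		# 20 concurrent streams, well above any browser per-host limit
		await asyncio.gather(*(open_stream(session, i) for i in range(20)))


asyncio.get_event_loop().run_until_complete(main())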