Performance issue with version 0.3.5 of async-pool
claracodes opened this issue · comments
In our Rails app we used to make 1000 HTTP requests in 15 seconds.
After updating to Rails 6.1, the same requests now take 75-90 seconds, and we didn't change anything on our side.
I played around with the versions, downgrading each gem (async-http, async, async-pool), and it appears to be caused by the upgrade of async-pool from 0.3.3 to 0.3.5.
Old versions of the async gems from the Gemfile:
async (1.27.0)
console (~> 1.10)
nio4r (~> 2.3)
timers (~> 4.1)
async-http (0.53.1)
async (~> 1.25)
async-io (~> 1.28)
async-pool (~> 0.2)
protocol-http (~> 0.21.0)
protocol-http1 (~> 0.13.0)
protocol-http2 (~> 0.14.0)
async-io (1.30.1)
async (~> 1.14)
async-pool (0.3.3)
async (~> 1.25)
New versions of the async gems from the Gemfile:
async (1.28.9)
console (~> 1.10)
nio4r (~> 2.3)
timers (~> 4.1)
async-http (0.54.1)
async (~> 1.25)
async-io (~> 1.28)
async-pool (~> 0.2)
protocol-http (~> 0.21.0)
protocol-http1 (~> 0.13.0)
protocol-http2 (~> 0.14.0)
async-io (1.30.2)
async (~> 1.14)
async-pool (0.3.5)
async (~> 1.25)
Hmmm interesting. We added some logic to minimise the number of active connections, but maybe this has impacted performance in some cases. I'll investigate.
I am trying to reproduce this regression.
I have some specs which stress the connection pool on purpose.
# async-pool 0.3.3
> bundle exec rspec -e "can handle many simultaneous requests"
Async::HTTP::Protocol::HTTP10
behaves like Async::HTTP::Protocol
with working server
using GET method
can handle many simultaneous requests
Pool: #<Async::Pool::Controller(63/∞) 0/1/29;0/1/28;0/1/28;0/1/29;0/1/27;0/1/28;0/1/28;0/1/27;0/1/27;0/1/27;0/1/25;0/1/26;0/1/24;0/1/24;0/1/25;0/1/23;0/1/23;0/1/22;0/1/22;0/1/22;0/1/21;0/1/21;0/1/21;0/1/21;0/1/20;0/1/18;0/1/18;0/1/18;0/1/19;0/1/17;0/1/17;0/1/17;0/1/14;0/1/14;0/1/15;0/1/15;0/1/13;0/1/12;0/1/12;0/1/12;0/1/12;0/1/11;0/1/10;0/1/10;0/1/10;0/1/10;0/1/9;0/1/8;0/1/8;0/1/7;0/1/7;0/1/7;0/1/6;0/1/6;0/1/6;0/1/5;0/1/5;0/1/4;0/1/3;0/1/3;0/1/2;0/1/1;0/1/1>
Duration = 0.29
Async::HTTP::Protocol::HTTP11
behaves like Async::HTTP::Protocol
with working server
using GET method
can handle many simultaneous requests
Pool: #<Async::Pool::Controller(63/∞) 0/1/29;0/1/28;0/1/28;0/1/29;0/1/27;0/1/28;0/1/28;0/1/27;0/1/27;0/1/27;0/1/25;0/1/26;0/1/24;0/1/24;0/1/25;0/1/23;0/1/23;0/1/22;0/1/22;0/1/22;0/1/21;0/1/21;0/1/21;0/1/21;0/1/20;0/1/18;0/1/18;0/1/18;0/1/19;0/1/17;0/1/17;0/1/17;0/1/14;0/1/14;0/1/15;0/1/15;0/1/13;0/1/12;0/1/12;0/1/12;0/1/12;0/1/11;0/1/10;0/1/10;0/1/10;0/1/10;0/1/9;0/1/8;0/1/8;0/1/7;0/1/7;0/1/7;0/1/6;0/1/6;0/1/6;0/1/5;0/1/5;0/1/4;0/1/3;0/1/3;0/1/2;0/1/1;0/1/1>
Duration = 0.27
Async::HTTP::Protocol::HTTP2
behaves like Async::HTTP::Protocol
with working server
using GET method
can handle many simultaneous requests
Pool: #<Async::Pool::Controller(1/∞) 0/128/1000>
Duration = 0.46
Compared to:
# async-pool 0.3.5
> bundle exec rspec -e "can handle many simultaneous requests"
Async::HTTP::Protocol::HTTP10
behaves like Async::HTTP::Protocol
with working server
using GET method
can handle many simultaneous requests
Pool: #<Async::Pool::Controller(9/∞) 0/1/11;0/1/10;0/1/9;0/1/8;0/1/8;0/1/3;0/1/2;0/1/1;0/1/1>
Duration = 0.37
Async::HTTP::Protocol::HTTP11
behaves like Async::HTTP::Protocol
with working server
using GET method
can handle many simultaneous requests
Pool: #<Async::Pool::Controller(9/∞) 0/1/11;0/1/10;0/1/9;0/1/8;0/1/8;0/1/3;0/1/2;0/1/1;0/1/1>
Duration = 0.31
Async::HTTP::Protocol::HTTP2
behaves like Async::HTTP::Protocol
with working server
using GET method
can handle many simultaneous requests
Pool: #<Async::Pool::Controller(1/∞) 0/128/1000>
Duration = 0.47
In the latter example, the connection pool is a little smaller: after connections are released, some are closed according to internal logic, to avoid retaining too many connections. There is always a trade-off between retaining a large number of connections, overall resource usage, and acquire/release latency.
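To make the trade-off concrete, here is a simplified sketch (not the actual async-pool implementation) of a pool that retains only a fixed number of idle connections on release. Retiring the surplus keeps resource usage low, but every retired connection means paying the connection setup cost again on a later acquire:

```ruby
# Toy pool illustrating the retain-vs-retire trade-off. The `keep`
# parameter and `ToyPool` class are illustrative, not async-pool API.
class ToyPool
  attr_reader :opened

  def initialize(keep: 4)
    @keep = keep   # how many idle connections to retain after release
    @idle = []
    @opened = 0
  end

  def acquire
    @idle.pop || open_connection
  end

  def release(conn)
    if @idle.size < @keep
      @idle << conn   # retained: the next acquire is nearly free
    end
    # otherwise retired: frees resources, but the next acquire reconnects
  end

  private

  def open_connection
    @opened += 1     # counts the expensive connection setups
    Object.new       # stands in for a real connection
  end
end

pool = ToyPool.new(keep: 2)
conns = Array.new(5) { pool.acquire }  # 5 concurrent users open 5 connections
conns.each { |c| pool.release(c) }     # only 2 retained, 3 retired
Array.new(5) { pool.acquire }          # 2 reused, 3 reopened
puts pool.opened                       # => 8
```

With a larger `keep`, the second burst of acquires would reuse all five connections and `opened` would stay at 5, at the cost of holding idle sockets open.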
That being said, I don't think this is the cause of your problems, at least not directly.
Are you able to collect data with the following environment set?
CONSOLE_DEBUG=Async::HTTP::Client,Async::Pool::Controller
Okay, I think I've narrowed it down to this new behaviour when the connection speed is slow:
Removing the overflowing check:
With the current master:
So, it's much slower because it can't reuse connections as efficiently: it tries to close connections whenever there is a large number of available connections.
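A back-of-the-envelope model shows why this hurts so much when connection setup is slow. In the debug logs above, opening a connection took roughly 0.2s; the reuse cost and the reuse ratios below are assumed, purely for illustration:

```ruby
# Illustrative cost model: total request time as a function of how many
# requests can reuse a pooled connection vs reconnect from scratch.
CONNECT_COST = 0.2   # seconds to establish a new connection (from the logs)
REUSE_COST   = 0.01  # seconds to reuse a pooled connection (assumed)

def total_time(requests:, reuse_ratio:)
  reused = (requests * reuse_ratio).round
  fresh  = requests - reused
  fresh * CONNECT_COST + reused * REUSE_COST
end

# If almost every request after warm-up reuses a connection:
puts total_time(requests: 300, reuse_ratio: 0.9).round(1)  # => 8.7
# If frequent retirement forces most requests to reconnect:
puts total_time(requests: 300, reuse_ratio: 0.3).round(1)  # => 42.9
```

Even with made-up ratios, the shape matches the reported numbers: dropping the reuse rate turns a handful of seconds of connection overhead into most of the total runtime.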
Ok, I have to admit that I don't understand the internal specifics entirely, but is the new behaviour desired, or will you change it back so connections can be reused efficiently? Or is there a setting I could change? Sorry if that doesn't make sense.
Here is the data you requested (tested with 300 HTTP requests), with async-pool on 0.3.3:
0.0s debug: Async::Pool::Controller [oid=0x12a20] [pid=33132] [2021-02-24 08:49:31 -0800]
| No available resources, allocating new one...
0.0s debug: Async::HTTP::Client [oid=0x12a34] [pid=33132] [2021-02-24 08:49:31 -0800]
| Making connection to #<Async::HTTP::Endpoint https://xxxxx {}>
0.23s debug: Async::Pool::Controller [oid=0x12a20] [pid=33132] [2021-02-24 08:49:31 -0800]
| No available resources, allocating new one...
0.23s debug: Async::HTTP::Client [oid=0x12a34] [pid=33132] [2021-02-24 08:49:31 -0800]
| Making connection to #<Async::HTTP::Endpoint https://xxxxx {}>
0.23s debug: Async::Pool::Controller [oid=0x12a20] [pid=33132] [2021-02-24 08:49:31 -0800]
...
| Reuse #<Async::HTTP::Protocol::HTTP1::Client:0x00007ff9a17cdbf0>
7.23s debug: Async::Pool::Controller [oid=0x12a20] [pid=33132] [2021-02-24 08:49:38 -0800]
| Wait for resource -> #<Async::HTTP::Protocol::HTTP1::Client:0x00007ff9af919b90>
7.23s debug: Async::Pool::Controller [oid=0x12a20] [pid=33132] [2021-02-24 08:49:38 -0800]
| Wait for resource -> #<Async::HTTP::Protocol::HTTP1::Client:0x00007ff9afc971c8>
7.23s debug: Async::Pool::Controller [oid=0x12a20] [pid=33132] [2021-02-24 08:49:38 -0800]
| Wait for resource -> #<Async::HTTP::Protocol::HTTP1::Client:0x00007ff9a5d92190>
7.23s debug: Async::Pool::Controller [oid=0x12a20] [pid=33132] [2021-02-24 08:49:38 -0800]
| Wait for resource -> #<Async::HTTP::Protocol::HTTP1::Client:0x00007ff9a17cdbf0>
7.23s debug: Async::Pool::Controller [oid=0x12a20] [pid=33132] [2021-02-24 08:49:38 -0800]
| Wait for resource -> #<Async::HTTP::Protocol::HTTP1::Client:0x00007ff9abac7518>
7.3s debug: Async::Pool::Controller [oid=0x12a20] [pid=33132] [2021-02-24 08:49:38 -0800]
| Reuse #<Async::HTTP::Protocol::HTTP1::Client:0x00007ff9a5d92190>
7.3s debug: Async::Pool::Controller [oid=0x12a20] [pid=33132] [2021-02-24 08:49:38 -0800]
| Reuse #<Async::HTTP::Protocol::HTTP1::Client:0x00007ff9afc971c8>
7.3s debug: Async::Pool::Controller [oid=0x12a20] [pid=33132] [2021-02-24 08:49:38 -0800]
| Reuse #<Async::HTTP::Protocol::HTTP1::Client:0x00007ff9af919b90>
7.3s debug: Async::Pool::Controller [oid=0x12a20] [pid=33132] [2021-02-24 08:49:38 -0800]
| Reuse #<Async::HTTP::Protocol::HTTP1::Client:0x00007ff9abac7518>
7.3s debug: Async::Pool::Controller [oid=0x12a20] [pid=33132] [2021-02-24 08:49:38 -0800]
| Reuse #<Async::HTTP::Protocol::HTTP1::Client:0x00007ff9a17cdbf0>
and with async-pool on 0.3.5:
0.0s debug: Async::Pool::Controller [oid=0x12a20] [pid=33329] [2021-02-24 08:54:25 -0800]
| No available resources, allocating new one...
0.0s debug: Async::HTTP::Client [oid=0x12a34] [pid=33329] [2021-02-24 08:54:25 -0800]
| Making connection to #<Async::HTTP::Endpoint https://xxxxx {}>
0.21s debug: Async::Pool::Controller [oid=0x12a20] [pid=33329] [2021-02-24 08:54:25 -0800]
| No available resources, allocating new one...
0.21s debug: Async::HTTP::Client [oid=0x12a34] [pid=33329] [2021-02-24 08:54:25 -0800]
| Making connection to #<Async::HTTP::Endpoint https://xxxxx {}>
0.21s debug: Async::Pool::Controller [oid=0x12a20] [pid=33329] [2021-02-24 08:54:25 -0800]
| Wait for resource -> #<Async::HTTP::Protocol::HTTP1::Client:0x00007feebe6b3c08>
0.28s debug: Async::Pool::Controller [oid=0x12a20] [pid=33329] [2021-02-24 08:54:25 -0800]
| Reuse #<Async::HTTP::Protocol::HTTP1::Client:0x00007feebe6b3c08>
0.41s debug: Async::Pool::Controller [oid=0x12a20] [pid=33329] [2021-02-24 08:54:26 -0800]
| No available resources, allocating new one...
0.41s debug: Async::HTTP::Client [oid=0x12a34] [pid=33329] [2021-02-24 08:54:26 -0800]
| Making connection to #<Async::HTTP::Endpoint https://xxxxx {}>
...
37.08s debug: Async::Pool::Controller [oid=0x12a20] [pid=33329] [2021-02-24 08:55:02 -0800]
| Reuse #<Async::HTTP::Protocol::HTTP1::Client:0x00007feec95d28d8>
37.08s debug: Async::Pool::Controller [oid=0x12a20] [pid=33329] [2021-02-24 08:55:02 -0800]
| Reuse #<Async::HTTP::Protocol::HTTP1::Client:0x00007feec7debc10>
37.09s debug: Async::Pool::Controller [oid=0x12a20] [pid=33329] [2021-02-24 08:55:02 -0800]
| Retire #<Async::HTTP::Protocol::HTTP1::Client:0x00007feec7debc10>
37.21s debug: Async::Pool::Controller [oid=0x12a20] [pid=33329] [2021-02-24 08:55:02 -0800]
| Wait for resource -> #<Async::HTTP::Protocol::HTTP1::Client:0x00007feec9549880>
37.21s debug: Async::Pool::Controller [oid=0x12a20] [pid=33329] [2021-02-24 08:55:02 -0800]
| Wait for resource -> #<Async::HTTP::Protocol::HTTP1::Client:0x00007feec7f6cb70>
37.21s debug: Async::Pool::Controller [oid=0x12a20] [pid=33329] [2021-02-24 08:55:02 -0800]
| Wait for resource -> #<Async::HTTP::Protocol::HTTP1::Client:0x00007feec95d28d8>
37.21s debug: Async::Pool::Controller [oid=0x12a20] [pid=33329] [2021-02-24 08:55:02 -0800]
| Wait for resource -> #<Async::HTTP::Protocol::HTTP1::Client:0x00007feec7ceaf28>
37.28s debug: Async::Pool::Controller [oid=0x12a20] [pid=33329] [2021-02-24 08:55:02 -0800]
| Reuse #<Async::HTTP::Protocol::HTTP1::Client:0x00007feec9549880>
37.28s debug: Async::Pool::Controller [oid=0x12a20] [pid=33329] [2021-02-24 08:55:02 -0800]
| Reuse #<Async::HTTP::Protocol::HTTP1::Client:0x00007feec95d28d8>
37.28s debug: Async::Pool::Controller [oid=0x12a20] [pid=33329] [2021-02-24 08:55:02 -0800]
| Reuse #<Async::HTTP::Protocol::HTTP1::Client:0x00007feec7f6cb70>
37.28s debug: Async::Pool::Controller [oid=0x12a20] [pid=33329] [2021-02-24 08:55:02 -0800]
| Reuse #<Async::HTTP::Protocol::HTTP1::Client:0x00007feec7ceaf28>
37.28s debug: Async::Pool::Controller [oid=0x12a20] [pid=33329] [2021-02-24 08:55:02 -0800]
| Retire #<Async::HTTP::Protocol::HTTP1::Client:0x00007feec7ceaf28>
I omitted some lines; let me know if you need them.
Are you making 1000 requests to the same server?
The new behaviour has some desirable traits.
The old behaviour also has some desirable traits, so I'll try to find a better middle-ground.
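One possible middle ground (purely illustrative, not necessarily what async-pool ended up doing) is to retire surplus connections only after they have sat idle for a grace period, so bursty workloads keep reusing them while genuinely idle connections still get closed:

```ruby
# Illustrative sketch: retire connections only after an idle grace period.
# `IdleTimeoutPool` and `IDLE_GRACE` are hypothetical names, not async-pool API.
class IdleTimeoutPool
  IDLE_GRACE = 5.0 # seconds a connection may stay idle before retirement

  Entry = Struct.new(:conn, :released_at)

  def initialize
    @idle = []
  end

  # Timestamps are passed in explicitly so the behaviour is easy to test;
  # a real pool would use a monotonic clock internally.
  def acquire(now = Process.clock_gettime(Process::CLOCK_MONOTONIC))
    prune(now)
    entry = @idle.pop
    entry ? entry.conn : Object.new # Object.new stands in for a new connection
  end

  def release(conn, now = Process.clock_gettime(Process::CLOCK_MONOTONIC))
    @idle << Entry.new(conn, now)
  end

  private

  # Retire connections idle longer than the grace period; recently
  # released ones stay available for bursty reuse.
  def prune(now)
    @idle.reject! { |e| now - e.released_at > IDLE_GRACE }
  end
end

pool = IdleTimeoutPool.new
conn = pool.acquire(0.0)
pool.release(conn, 0.0)
puts pool.acquire(1.0).equal?(conn) # => true (reused within the grace period)
```

This trades a little extra resource retention for much better reuse during sustained bursts like the 1000-request workload above.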
Yes, we do at the moment...
Thank you!
Regarding the behaviour, do you expect to limit the number of simultaneous requests to the same service?
i.e. are you intending to make 1000 concurrent connections?
No, there is no need to limit the number of simultaneous requests
With the latest release of async-pool, the logic that was slowing down this use case has been reverted. I believe this issue should be fixed. Feel free to reopen if that's not the case.