google-deepmind / reverb

Reverb is an efficient and easy-to-use data storage and transport system designed for machine learning research

Reverb adder performance decreases over time

ostap-viniavskyi opened this issue

Hi!
I'm using the acme library to train an R2D2 agent on Atari games. I'm training on Vertex AI with 128 actor nodes, 1 reverb node, and 1 learner node. After about 20 hours of training, the utilisation of the CPU cores on the actors decreases, and the speed of experience collection decreases as well. After some investigation, I found that it's the function that adds experience to Reverb that takes more and more time as training progresses.
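For what it's worth, here is roughly the kind of wrapper I used to measure this. It's a simplified sketch, not my exact code: `TimedAdder` and `log_every` are illustrative names, and the methods follow the acme adder interface (`add_first`/`add`/`reset`) as I understand it.

```python
import time


class TimedAdder:
    """Wraps an acme adder and logs the running mean latency of add() calls."""

    def __init__(self, adder, log_every=1000):
        self._adder = adder
        self._log_every = log_every
        self._total_seconds = 0.0
        self._num_calls = 0

    def add_first(self, timestep):
        self._adder.add_first(timestep)

    def add(self, action, next_timestep, extras=()):
        start = time.perf_counter()
        self._adder.add(action, next_timestep, extras)
        self._total_seconds += time.perf_counter() - start
        self._num_calls += 1
        if self._num_calls % self._log_every == 0:
            mean = self._total_seconds / self._num_calls
            print(f'mean add() latency: {mean:.6f}s')

    def reset(self):
        self._adder.reset()
```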

Below you can see the CPU utilisation for all the nodes participating in the training. The green curve corresponds to the evaluator (which is almost the same as an actor, except that it skips the step of adding experience to Reverb).

[image: CPU utilisation for all nodes]

I'm using:
dm-acme==0.4.0
dm-reverb==0.7.0

I use SequenceAdder to add experience, and a SampleToInsertRatio rate limiter to cap the number of insertions relative to the number of samples drawn by the learner. The min size of the Reverb table is 6250 and the max size is 100k. A rough sketch of this setup is below.
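For reference, this is roughly what the table configuration looks like as I understand acme builds it. The table name, sampler, `samples_per_insert`, and `error_buffer` below are illustrative placeholders, not the exact values from my run; only `min_size_to_sample` and `max_size` match the numbers above.

```python
import reverb

table = reverb.Table(
    name='priority_table',  # placeholder name
    sampler=reverb.selectors.Prioritized(priority_exponent=0.6),  # illustrative
    remover=reverb.selectors.Fifo(),
    max_size=100_000,
    rate_limiter=reverb.rate_limiters.SampleToInsertRatio(
        samples_per_insert=0.25,   # illustrative, not my actual ratio
        min_size_to_sample=6_250,
        error_buffer=1_000.0,      # illustrative slack around the target ratio
    ),
)
server = reverb.Server(tables=[table], port=8000)
```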

What does the Learner's load look like? It might be the case that the rate limiter starts blocking inserts over time; one way to check is sketched below.
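A rough sketch of such a check (`localhost:8000` and `priority_table` are placeholders for your actual Reverb address and table name): poll `server_info()` from a `reverb.Client` and watch how the table size and rate-limiter statistics evolve.

```python
import time

import reverb

client = reverb.Client('localhost:8000')  # placeholder address

while True:
    info = client.server_info()['priority_table']  # placeholder table name
    # TableInfo exposes the current size and episode count; rate_limiter_info
    # is a proto carrying per-call insert/sample statistics.
    print(info.current_size, info.num_episodes)
    print(info.rate_limiter_info)
    time.sleep(60)
```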

The GPU load on the learner decreases at the same time as on the actors:

[image: learner GPU load]

The CPU load decreases as well:

[image: learner CPU load]

Also, the speed of learning decreases over time:

[image: learning speed]

It is hard to tell from these plots where the problem is. It could be on the Reverb side (but I doubt it), or it could be on the Learner side (for example, if the Learner computes some statistics over all already-executed episodes after each step; if such stats are computed in a single thread, the Learner's CPU usage would be low). Can you make the Learner just sample data from Reverb (disable the training logic) and see if the problem goes away? A minimal sketch of such a sampling-only loop is below.
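Something along these lines should do. This is only a sketch: the address and table name are placeholders for your actual setup, and a real learner would normally consume data through a dataset pipeline rather than the client API, but it is enough to exercise Reverb's sampling path in isolation.

```python
import reverb

client = reverb.Client('localhost:8000')  # placeholder address

# Drain samples without doing any training work. If throughput stays flat
# over many hours, the slowdown is likely in the training step, not Reverb.
for _ in client.sample('priority_table', num_samples=1_000_000):
    pass  # discard the sample; we only want to exercise the sampling path
```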

Just to make sure: you are using JAX R2D2?