yhhhli / BRECQ

PyTorch implementation of BRECQ, ICLR 2021

What is the purpose for setting retain_graph=True?

un-knight opened this issue · comments

err.backward(retain_graph=True)

What is the purpose for setting retain_graph=True?

commented

Hello, @un-knight! When you run a forward pass, the activations are saved for each operation so that gradients can be computed later. By default they are freed as soon as the gradients have been calculated (that is, when you call err.backward() with the default retain_graph=False). If you want to keep the saved activations so you can call backward() on the same graph again, you set retain_graph=True.
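A minimal sketch of this behavior (not BRECQ's actual training code): calling backward() a second time on the same graph only works if the first call used retain_graph=True, because PyTorch otherwise frees the saved activations after the first backward pass.

```python
import torch

x = torch.ones(3, requires_grad=True)
y = (x * 2).sum()

# First backward: retain_graph=True keeps the saved activations alive.
y.backward(retain_graph=True)
first_grad = x.grad.clone()  # gradient of y w.r.t. x is 2 everywhere

# Second backward succeeds because the graph was retained;
# gradients accumulate into x.grad.
y.backward()
print(x.grad)  # twice the first gradient, i.e. 4 everywhere
```

Without retain_graph=True on the first call, the second y.backward() raises a RuntimeError ("Trying to backward through the graph a second time").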

@KLONNEX Thanks for explaining that!