What is the purpose of setting retain_graph=True?
un-knight opened this issue · comments
Junxian Ye commented
Line 91 in 2888b29
What is the purpose of setting retain_graph=True?
Ivan commented
Hello, @un-knight! During the forward pass, PyTorch saves the intermediate activations it needs to compute gradients. By default (retain_graph=False), those buffers are freed as soon as err.backward() finishes, so the graph can only be backpropagated through once. Setting retain_graph=True keeps the saved activations alive, which lets you call backward() on the same graph again (e.g. when combining multiple losses or running repeated backward passes).
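A minimal standalone sketch of the difference (a toy computation, not the repo's actual model): with retain_graph=True the first backward() keeps the graph, so a second backward() on the same output succeeds instead of raising a RuntimeError.

```python
import torch

x = torch.tensor([2.0], requires_grad=True)
y = x ** 2      # forward pass saves activations needed for backward
z = 3 * y       # z = 3 * x**2, so dz/dx = 6 * x

# Keep the graph's saved activations so we can backprop again.
z.backward(retain_graph=True)
first = x.grad.clone()   # 6 * 2.0 = 12.0

x.grad.zero_()
z.backward()             # only works because the graph was retained
second = x.grad.clone()

print(first.item(), second.item())  # 12.0 12.0
```

With the default retain_graph=False, the second z.backward() call would fail with "Trying to backward through the graph a second time".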
Junxian Ye commented
@KLONNEX thanks for telling me that!