zyh-uaiaaaa / Erasing-Attention-Consistency

Official implementation of the ECCV2022 paper: Learn From All: Erasing Attention Consistency for Noisy Label Facial Expression Recognition



Memory leak

kulich-d opened this issue · comments

Hi!
You have a memory leak during training here

It happens because the tensors are still attached to the autograd graph:
 print(correct_num) -> grad_fn=<AddBackward>

To solve this problem, I used .detach():

loss = loss.detach().cpu()
_, predicts = torch.max(output.detach().cpu(), 1)
correct_num = torch.eq(predicts.detach().cpu(), labels.detach().cpu()).sum()

And the memory stopped leaking.
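A minimal, self-contained sketch of why the `.detach()` calls matter (hypothetical model and data, not the repo's actual training loop): metrics computed from the model output stay attached to the autograd graph via `grad_fn`, and accumulating them across batches keeps every batch's graph alive in memory unless they are detached first.

```python
import torch

# Hypothetical stand-ins for the repo's model and batch.
model = torch.nn.Linear(4, 2)
inputs = torch.randn(8, 4)
labels = torch.randint(0, 2, (8,))

output = model(inputs)
loss = torch.nn.functional.cross_entropy(output, labels)

# Before detaching: loss still carries the autograd graph,
# so storing it for logging would retain the whole graph.
assert loss.grad_fn is not None

# After detaching: the tensors are plain values with no graph,
# safe to accumulate across batches for logging/metrics.
loss_val = loss.detach().cpu()
_, predicts = torch.max(output.detach().cpu(), 1)
correct_num = torch.eq(predicts, labels.cpu()).sum()
```

Detaching before `.cpu()` also avoids copying the graph bookkeeping to host memory; only the raw values are moved.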

I've attached a memory profile file.

Hi, which module did you use to generate the profile file? Thanks.