Stonesjtu / pytorch_memlab

Profiling and inspecting memory in pytorch

Do you support profiling and printing the corresponding memory usage of each line while training?

GoingMyWay opened this issue · comments

Like https://pypi.org/project/memory-profiler/, do you support profiling and printing the corresponding memory usage of each line while training?

Do you mean profiling CPU memory at the same time?

No. I mean something like https://pypi.org/project/memory-profiler/, which can print per-line memory usage while the code runs, like this:

Line #    Mem usage  Increment   Line Contents
==============================================
     3                           @profile
     4      5.97 MB    0.00 MB   def my_func():
     5     13.61 MB    7.64 MB       a = [1] * (10 ** 6)
     6    166.20 MB  152.59 MB       b = [2] * (2 * 10 ** 7)
     7     13.61 MB -152.59 MB       del b
     8     13.61 MB    0.00 MB       return a

And will you support this feature and print GPU memory usage like this? I think it would make it much easier to detect which function or which line of code costs GPU memory.
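For reference, a report like the one above comes from decorating a function with memory-profiler's @profile decorator. A minimal sketch, assuming memory_profiler is installed via pip and using the same my_func as its docs:

    from memory_profiler import profile

    @profile
    def my_func():
        a = [1] * (10 ** 6)
        b = [2] * (2 * 10 ** 7)
        del b
        return a

    if __name__ == '__main__':
        my_func()   # the line-by-line memory report is printed to stdout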

I think https://github.com/Stonesjtu/pytorch_memlab#memory-profiler is what you are looking for.

Thanks. Does it support adding a decorator to a function to use it?

Sure it does, just like other line-by-line profiling tools.

You can refer to the memory profiler section of the README for full documentation.

    import torch
    from pytorch_memlab import profile

    class Net(torch.nn.Module):
        def __init__(self):
            super().__init__()

        @profile
        def forward(self, inp):
            # do something with inp here
            return inp
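A minimal usage sketch to go with the snippet above (the input shape and the .cuda() calls are illustrative assumptions, not from this thread):

    net = Net().cuda()                  # put the module on the GPU being profiled
    inp = torch.randn(32, 128).cuda()   # hypothetical input tensor
    out = net(inp)                      # each call to the decorated forward() is recorded
    # pytorch_memlab collects per-line GPU memory stats for forward() and prints
    # the report (by default when the program exits; see the README for details)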

Closed as this feature is already implemented. Feel free to reopen if needed.