xl0 / lovely-tensors

Tensors, ready for human consumption

Home Page: https://xl0.github.io/lovely-tensors


Is it possible to display a tensor's allocated CPU/GPU memory in Kb, Mb or Gb?

VlSomers opened this issue · comments

The question is in the title, thanks for your help!

Hi.

Do you mean just tensor.numel() times the dtype size? That should be pretty straightforward.

If you mean, how much memory is being allocated on the GPU, I don't know if there is a precise way to tell. Do you know one?
I imagine there may be a lot of allocations that are hard to pin to a given tensor.
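The "numel times dtype size" estimate mentioned above is easy to compute by hand. A minimal sketch (the dtype byte sizes below are the standard PyTorch element sizes; torch itself exposes this directly as tensor.numel() * tensor.element_size()):

```python
# Sketch: estimate a tensor's own memory footprint as numel * element size.
# This counts only the tensor's data buffer, not any extra allocator overhead.

DTYPE_SIZES = {"float32": 4, "float16": 2, "float64": 8, "int64": 8, "uint8": 1}

def tensor_bytes(shape, dtype="float32"):
    """Number of bytes needed for a tensor of the given shape and dtype."""
    n = 1
    for dim in shape:
        n *= dim
    return n * DTYPE_SIZES[dtype]

print(tensor_bytes((3, 196, 196)))  # 460992 bytes for float32, i.e. ~450 Kb
```

For the float32 image tensor discussed in this thread, 3 * 196 * 196 = 115248 elements at 4 bytes each gives 460992 bytes, which is the ~450 Kb shown later.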

"Do you mean just tensor.numel() times the dtype size?" -> Yes, something like that. It would be nice to have an option to display that information in the tensor "summary"! It would also be nice to have the "device" information displayed next to it.

I can add this as a config option that is off by default.

How do you feel about tensor[3, 196, 196] n=115248 (450 Kb) x∈[-2.118, 2.640] μ=-0.388 σ=1.073

That would be nice, what are your thoughts on this?

I feel it might be a bit excessive to have on by default, but I'll provide an easy way to enable it.

Thank you for your quick answers!

tensor[3, 196, 196] n=115248 (0.4Mb) x∈[-2.118, 2.640] μ=-0.388 σ=1.073

@VlSomers, sorry about the delay. I actually like how the result looks, so it's on by default when more than 1 Kb of memory is consumed.

To adjust this limit, use set_config(show_mem_above=x) or the context manager with config(show_mem_above=x): ...

Set to 0 to always display it, and to torch.inf to disable it completely.
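The threshold behaviour described above can be sketched as follows. This is an illustration of the logic, not lovely-tensors' actual implementation; the function name format_mem and the exact rounding are assumptions:

```python
# Sketch of the show_mem_above threshold: memory is appended to the summary
# only when it exceeds the threshold (default 1 Kb). 0 always shows it;
# float("inf") disables it entirely.

def format_mem(nbytes, show_mem_above=1024):
    """Return a human-readable size string, or '' if below the threshold."""
    if nbytes <= show_mem_above:
        return ""
    if nbytes < 1024 * 1024:
        return f"{nbytes / 1024:.0f}Kb"
    return f"{nbytes / (1024 * 1024):.1f}Mb"

print(format_mem(460992))                               # "450Kb"
print(format_mem(500))                                  # "" (below the default)
print(format_mem(460992, show_mem_above=float("inf")))  # "" (disabled)
```

With show_mem_above=0 any nonzero tensor shows its size; with float("inf") nothing ever does, matching the two extremes mentioned above.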

Ok, thanks a lot!