ucbrise / actnn

ActNN: Reducing Training Memory Footprint via 2-Bit Activation Compressed Training

How to deploy ActNN with libtorch?

lucify123 opened this issue · comments

It's really interesting work! I just wonder how we can deploy ActNN using libtorch. With libtorch or ONNX (from Python to C++), ActNN would be even more useful. Thank you~

Thanks for your interest in our work. ActNN is a library for reducing training memory footprint by compressing the saved activations in each layer. At inference time, a layer's activations aren't saved anyway, since we don't need to compute gradients. Therefore, ActNN won't reduce the memory footprint at inference time.
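To illustrate the idea behind the training-time savings, here is a minimal conceptual sketch of per-group 2-bit activation quantization (not ActNN's actual implementation; all names are hypothetical): each saved float32 activation is mapped to one of 4 levels between the group's min and max, so 32 bits of storage become 2 bits per value, roughly a 16x reduction before metadata.

```python
def quantize_2bit(values):
    """Quantize a list of floats to 2-bit codes plus (min, scale).

    Hypothetical helper for illustration only, not ActNN's API.
    """
    lo, hi = min(values), max(values)
    scale = (hi - lo) / 3 or 1.0  # 4 quantization levels -> 3 intervals
    codes = [round((v - lo) / scale) for v in values]  # each code in {0,1,2,3}
    return codes, lo, scale


def dequantize_2bit(codes, lo, scale):
    """Reconstruct approximate activations from the 2-bit codes."""
    return [lo + c * scale for c in codes]


# A toy "saved activation" tensor, compressed for the backward pass
# and decompressed when the gradient is computed.
acts = [0.1, 0.5, -0.2, 0.9]
codes, lo, scale = quantize_2bit(acts)
recon = dequantize_2bit(codes, lo, scale)
# Every code fits in 2 bits; reconstruction error is bounded by scale / 2.
```

Since this compression only applies to activations stashed for the backward pass, there is nothing analogous to compress during inference, which is why a libtorch/ONNX export wouldn't gain memory from it.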