dcmocanu / sparse-evolutionary-artificial-neural-networks

Always sparse. Never dense. But never say never. A Sparse Training repository for the Adaptive Sparse Connectivity concept and its algorithmic instantiation, i.e. Sparse Evolutionary Training, to boost Deep Learning scalability on various aspects (e.g. memory and computational time efficiency, representation and generalization power).

Home Page: https://www.nature.com/articles/s41467-018-04316-3


Provide Pytorch implementation

impredicative opened this issue · comments

Is it possible to provide a Pytorch implementation of a custom sparse layer? For motivation, consider https://github.com/AlliedToasters/synapses . I think you can improve on it with some of the finer points that you must know. It is standard practice for researchers these days to provide a Pytorch implementation. The goal is to be able to integrate an efficient implementation of this custom layer into other general Pytorch code. If not, feel free to close this issue. Thanks.

Thank you very much for finding this work interesting. I agree with you that it would be very useful to have an efficient implementation of SET layers in PyTorch. At the same time, I believe it wouldn't be very efficient to have two parallel active projects with the same goal. I have discussed with Michael Klear, the author of the Synapses project, and I will share my experience with him to help him develop Synapses further. In time, I hope this will lead to the point where we have a good integration of efficiently implemented SET layers with general PyTorch code.
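For readers landing here before such an integration exists, the core of SET's adaptive sparse connectivity is a periodic prune-and-regrow step applied to each sparse layer: remove a fraction of the smallest-magnitude active weights, then regrow the same number of connections at random inactive positions, so the layer's sparsity level stays constant throughout training. A minimal framework-agnostic sketch in NumPy (function names and the `zeta` default are illustrative, not from this repository's code) could look like:

```python
import numpy as np

rng = np.random.default_rng(0)

def init_sparse(n_in, n_out, density=0.1, rng=rng):
    """Create a random sparse weight matrix and its boolean mask
    (Erdos-Renyi-style random sparse initialization)."""
    mask = rng.random((n_in, n_out)) < density
    weights = rng.standard_normal((n_in, n_out)) * mask
    return weights, mask

def evolve(weights, mask, zeta=0.3, rng=rng):
    """One SET rewiring step: prune the zeta fraction of the
    smallest-magnitude active weights, then regrow the same number
    of connections at random inactive positions, keeping the total
    number of connections constant."""
    active = np.flatnonzero(mask)
    n_prune = int(zeta * active.size)
    # prune: drop the active weights closest to zero
    order = np.argsort(np.abs(weights.ravel()[active]))
    prune = active[order[:n_prune]]
    mask.ravel()[prune] = False
    weights.ravel()[prune] = 0.0
    # regrow: activate random currently-inactive positions
    inactive = np.flatnonzero(~mask.ravel())
    grow = rng.choice(inactive, size=n_prune, replace=False)
    mask.ravel()[grow] = True
    weights.ravel()[grow] = rng.standard_normal(n_prune) * 0.01
    return weights, mask
```

In a PyTorch port, the same logic would typically live in a hook or callback that runs after each epoch, with `mask` applied to the layer's weight tensor in the forward pass (as Synapses does); the masked-dense representation above trades memory for simplicity, whereas a truly sparse implementation would use sparse tensor formats.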