lava-nc / lava-dl

Deep Learning library for Lava

Home Page: https://lava-nc.org

How can the non-linear activation functions of an ANN be approximated using lava.dl to extend its applications?

zjuls opened this issue

User story

As a user, I want to learn how to approximate the non-linear activation functions of an ANN using lava.dl, in order to extend its applications.

Conditions of satisfaction

  • An optional list of conditions that have to be fulfilled for this feature to be complete.
  • Users can choose either a sparse or a standard system to adapt to different applications.

@zjuls can you explain more about the feature you are asking for? Have you looked at lava.dl.bootstrap?

Thanks for your time! What I mean is: is there a way to use torch.nn.ReLU or other non-linear activation functions in lava.dl? Or is it possible to train a model using torch/tensorflow and then deploy it with Lava on Loihi?

@zjuls ANN activations like ReLU would not be the best use of Loihi hardware. What we really want are spiking neuron models that send sparse outputs, which truly exploit the principles of neuromorphic hardware.

That being said, lava.dl.slayer is built on top of PyTorch, so you can use torch.nn.ReLU in place of the existing neuron models and train your network; it just will not be efficient on Loihi. On the other hand, Sigma-Delta-ReLU spiking neurons resemble the ReLU activation and exploit temporal redundancy to sparsify output messaging, which might interest you. Here is one example using it.
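
For concreteness, here is a rough sketch (not from this thread) of a tiny Sigma-Delta network built with lava.lib.dl.slayer blocks, where the neuron's internal non-linearity is plain ReLU. The block name (slayer.block.sigma_delta.Dense) and the parameter keys ('threshold', 'activation') follow the lava-dl SDNN tutorials and should be treated as assumptions, since the exact API may differ between lava-dl versions.

```python
# Rough sketch only: a two-layer Sigma-Delta (SDNN) network in lava-dl SLAYER
# whose internal activation is ReLU. The parameter keys and block signatures
# are taken from the lava-dl SDNN tutorials and may vary between versions.
import torch
import torch.nn.functional as F
import lava.lib.dl.slayer as slayer


class SigmaDeltaMLP(torch.nn.Module):
    def __init__(self):
        super().__init__()
        neuron_params = {
            'threshold': 0.1,      # delta-encoding threshold (controls sparsity)
            'activation': F.relu,  # ANN-style non-linearity inside the neuron
        }
        self.blocks = torch.nn.ModuleList([
            slayer.block.sigma_delta.Dense(neuron_params, 784, 128),
            slayer.block.sigma_delta.Dense(neuron_params, 128, 10),
        ])

    def forward(self, x):
        # SLAYER blocks expect a trailing time dimension: (batch, features, time)
        for block in self.blocks:
            x = block(x)
        return x


# Example forward pass with 16 time steps; training then uses ordinary
# PyTorch optimizers and losses.
net = SigmaDeltaMLP()
out = net(torch.rand(2, 784, 16))
print(out.shape)  # expected: torch.Size([2, 10, 16])
```

Because the blocks are ordinary torch.nn.Module objects, the rest of the training loop is standard PyTorch; only the neuron dynamics and the eventual export to Loihi are Lava-specific.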