How to approximate the non-linear activation function in ANN using lava.dl to extend its application? #212
Replies: 3 comments
-
@zjuls can you explain more about the feature you are asking for? Have you looked at …
-
Thanks for your time! What I mean is: is there a way to use torch.nn.ReLU or other non-linear activation functions in lava.dl? Or is there a way to train a model using torch/tensorflow and then implement it with lava on Loihi?
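For the second part of the question, lava-dl trains spiking networks directly in PyTorch and exports them for deployment on Loihi. Below is a hedged sketch of that workflow, adapted from the lava-dl README/tutorials; names such as `slayer.block.cuba.Dense`, the neuron-parameter keys, `export_hdf5`, and `netx.hdf5.Network` should be verified against the installed lava-dl version.

```python
# Hedged sketch of the lava-dl (SLAYER) training workflow, adapted from the
# lava-dl README/tutorials. Check exact names/signatures against your version.
import torch
import h5py
import lava.lib.dl.slayer as slayer


class SpikingMLP(torch.nn.Module):
    def __init__(self):
        super().__init__()
        # CUBA LIF neuron parameters take the place of an ANN activation.
        neuron_params = {
            'threshold': 1.25,
            'current_decay': 0.25,
            'voltage_decay': 0.03,
            'requires_grad': True,
        }
        self.blocks = torch.nn.ModuleList([
            slayer.block.cuba.Dense(neuron_params, 200, 256, weight_norm=True),
            slayer.block.cuba.Dense(neuron_params, 256, 10, weight_norm=True),
        ])

    def forward(self, spike):
        # Expected input shape: (batch, features, time). Each block applies a
        # linear layer followed by spiking neuron dynamics instead of ReLU.
        for block in self.blocks:
            spike = block(spike)
        return spike

    def export_hdf5(self, filename):
        # Export a platform-independent network description that
        # lava.lib.dl.netx can load and map onto Loihi.
        h = h5py.File(filename, 'w')
        layer = h.create_group('layer')
        for i, block in enumerate(self.blocks):
            block.export_hdf5(layer.create_group(f'{i}'))


# A regular PyTorch training loop applies (slayer.loss.SpikeRate is one option);
# afterwards:
#   net.export_hdf5('network.net')
#   from lava.lib.dl import netx
#   loihi_net = netx.hdf5.Network(net_config='network.net')
```

lava-dl also ships a bootstrap module for rate-based (ANN-like) training, which may be closer to what the question is after; its current interface should be checked in the lava-dl docs.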
-
@zjuls ANN activations like ReLU would not be the best use of Loihi hardware. What we really want are spiking neuron models that send sparse outputs, which truly exploit the principles of neuromorphic hardware. That being said, …
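To connect this back to the title question: a ReLU can be approximated by the firing rate of an integrate-and-fire neuron driven by a constant input, which is the intuition behind rate-coded ANN-to-SNN conversion. Here is a minimal plain-PyTorch illustration of that idea (not a lava or lava-dl API, just a sketch):

```python
import torch

def if_rate(x, threshold=1.0, num_steps=100):
    """Firing rate of a simple integrate-and-fire neuron driven by a constant
    input current x; the rate approximates max(x, 0), clipped at `threshold`."""
    v = torch.zeros_like(x)                  # membrane potential
    spike_count = torch.zeros_like(x)
    for _ in range(num_steps):
        v = v + x                            # integrate the input current
        fired = (v >= threshold).float()     # spike wherever threshold is crossed
        spike_count = spike_count + fired
        v = v - fired * threshold            # "soft reset": subtract the threshold
    return spike_count * threshold / num_steps

x = torch.linspace(-2.0, 2.0, 9)
print(torch.relu(x))   # exact ReLU
print(if_rate(x))      # rate-coded approximation (saturates at the threshold)
```

The approximation only holds up to the neuron's maximum firing rate, and a longer time window trades latency for accuracy.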
-
User story
As a user, I want to learn how to approximate non-linear ANN activation functions using lava.dl, to extend its range of applications.
Conditions of satisfaction