
about the activation function #7

Open
CHonChou opened this issue Nov 24, 2020 · 6 comments
@CHonChou

Is it possible to replace the ReLU activation function in snn.py with PReLU?

@CHonChou
Author

In other words, I am asking about the activation function in vgg_spiking.py.

@nitin-rathi
Owner

The activation function in vgg_spiking.py is a placeholder. The actual activation is integrate-and-fire (IF), implemented in the LinearSpike/STDB class.
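
For reference, here is a minimal sketch of what such an integrate-and-fire activation with a surrogate gradient typically looks like in PyTorch; the class names, threshold value, and gradient shape below are illustrative assumptions, not the exact LinearSpike/STDB code from this repo:

```python
import torch
import torch.nn as nn

class IFSpike(torch.autograd.Function):
    """Integrate-and-fire spike with a linear surrogate gradient
    (illustrative stand-in, not the repo's LinearSpike/STDB class)."""

    gamma = 0.3  # surrogate-gradient scale (assumed value)

    @staticmethod
    def forward(ctx, mem, threshold):
        ctx.save_for_backward(mem)
        ctx.threshold = threshold
        # Forward pass: emit a binary spike where the membrane potential crosses threshold
        return (mem >= threshold).float()

    @staticmethod
    def backward(ctx, grad_output):
        mem, = ctx.saved_tensors
        # Backward pass: triangular surrogate, nonzero only near the threshold
        grad = IFSpike.gamma * torch.clamp(1.0 - torch.abs(mem - ctx.threshold), min=0.0)
        return grad_output * grad, None

class IFNeuron(nn.Module):
    """Accumulates input current over timesteps and resets by subtraction."""

    def __init__(self, threshold=1.0):
        super().__init__()
        self.threshold = threshold
        self.mem = None  # call reset() before every new input sequence

    def reset(self):
        self.mem = None

    def forward(self, x):
        if self.mem is None:
            self.mem = torch.zeros_like(x)
        self.mem = self.mem + x                       # integrate input current
        spike = IFSpike.apply(self.mem, self.threshold)
        self.mem = self.mem - spike * self.threshold  # soft reset (subtract threshold)
        return spike
```

So swapping the ReLU placeholder in vgg_spiking.py would not change the spiking dynamics; the behavior is determined by this spike/reset logic.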

@CHonChou
Author

If I change the activation function used by the ANN from ReLU to PReLU and AvgPool2d to MaxPool2d, what should I change in vgg_spiking.py accordingly?

@CHonChou
Author

If I apply batch normalization in the ANN, do I need to make corresponding changes in the SNN?

@nitin-rathi
Owner

This ANN-SNN conversion method only works for ReLU, average pooling, and dropout. If you plan to include batch norm, you may need to define additional blocks for the SNN.
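
For what it's worth, a common workaround in ANN-SNN conversion is to fold the trained batch-norm parameters into the preceding convolution before conversion, so the ANN seen by the conversion pipeline again contains only conv/ReLU/avg-pool/dropout. The helper below is a hypothetical sketch, not part of this repository:

```python
import torch
import torch.nn as nn

def fold_bn_into_conv(conv: nn.Conv2d, bn: nn.BatchNorm2d) -> nn.Conv2d:
    """Return a Conv2d equivalent (in eval mode) to conv followed by bn.
    Hypothetical helper for illustration; not part of this repo."""
    fused = nn.Conv2d(conv.in_channels, conv.out_channels, conv.kernel_size,
                      conv.stride, conv.padding, conv.dilation, conv.groups,
                      bias=True)
    # Per-channel scale applied by batch norm: gamma / sqrt(running_var + eps)
    scale = bn.weight.data / torch.sqrt(bn.running_var + bn.eps)
    fused.weight.data = conv.weight.data * scale.reshape(-1, 1, 1, 1)
    conv_bias = conv.bias.data if conv.bias is not None else torch.zeros_like(bn.running_mean)
    # Shift: (conv_bias - running_mean) * scale + beta
    fused.bias.data = (conv_bias - bn.running_mean) * scale + bn.bias.data
    return fused
```

In eval mode the folded convolution is mathematically equivalent to conv followed by batch norm, so the existing ReLU-based conversion and threshold balancing can be applied to it unchanged.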

@CHonChou
Copy link
Author

Thank you a lot for your reply, I'll try to figure this out.
