add new Activation function: selu #6969

Closed · wants to merge 2 commits

Conversation

@LiangXu123 (Author)

add new Activation function: selu

Read the paper for more detail:
Self-Normalizing Neural Networks: https://arxiv.org/abs/1706.02515

Discussion:
https://www.reddit.com/r/MachineLearning/comments/6g5tg1/r_selfnormalizing_neural_networks_improved_elu/

Usage: just like relu,
model.add(Dense(64, activation='relu'))
becomes
model.add(Dense(64, activation='selu'))
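
For reference, here is a minimal sketch of what the selu activation computes, using the alpha and scale constants derived in the paper (illustrative only, not the exact code in this PR):

import numpy as np

def selu(x):
    # SELU: scale * x                    for x > 0
    #       scale * alpha * (exp(x) - 1) for x <= 0
    # alpha and scale are the fixed constants from
    # https://arxiv.org/abs/1706.02515, chosen so activations converge
    # toward zero mean and unit variance across layers.
    alpha = 1.6732632423543772
    scale = 1.0507009873554805
    return scale * np.where(x > 0.0, x, alpha * (np.exp(x) - 1.0))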

Experiment on MNIST (run keras-master/examples/mnist_cnn.py):

With relu:
Test loss: 0.025632923909
Test accuracy: 0.9915

With relu replaced by selu throughout the network:
Test loss: 0.052635348707
Test accuracy: 0.9837
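
For context, the swap just replaces each 'relu' activation in the example model with 'selu'; here is a sketch following the stock mnist_cnn.py layer sizes (assuming 'selu' is registered as an activation, as this PR proposes; the exact example file may differ):

from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Dropout, Flatten, Dense

model = Sequential()
model.add(Conv2D(32, kernel_size=(3, 3), activation='selu', input_shape=(28, 28, 1)))
model.add(Conv2D(64, (3, 3), activation='selu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))
model.add(Flatten())
model.add(Dense(128, activation='selu'))
model.add(Dropout(0.5))
model.add(Dense(10, activation='softmax'))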

@LiangXu123 (Author)

Forgot to add tests in backend_test.py; closing this PR.

@LiangXu123 closed this Jun 13, 2017
@ahundt (Contributor) commented Jun 13, 2017

@cc786537662 You might want to look at #6969 before you add more changes

@Danielhiversen (Contributor)

@ahundt I think you mean #6924

@LiangXu123 (Author)

Nice job, that covers my PR. Thank you guys!

@ahundt (Contributor) commented Jun 13, 2017

Yeah, that's what I meant. Thanks!
