
add selu activation function #10889

Closed
wants to merge 1 commit into from

Conversation

lakshayg

resolves #10888

This pull request changes:

  • adds the selu activation function

I have not yet added the test data and would appreciate guidance from the devs on how to do it.
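
For context, here is a minimal standalone sketch of the SELU function this PR wires into the TensorFlow importer. The constants are the ones derived in Klambauer et al., "Self-Normalizing Neural Networks" (2017); the `selu` helper below is illustrative only and not part of this PR.

```cpp
#include <cmath>

// SELU as defined in Klambauer et al. (2017):
//   selu(x) = scale * x                       for x > 0
//   selu(x) = scale * alpha * (exp(x) - 1)    otherwise
static float selu(float x)
{
    const float scale = 1.0507009873554804934193349852946f;
    const float alpha = 1.6732632423543772848170429916717f;
    return x > 0.0f ? scale * x : scale * alpha * (std::exp(x) - 1.0f);
}
```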

@dkurt dkurt self-assigned this Feb 17, 2018
@@ -1553,6 +1553,14 @@ void TFImporter::populateNet(Net dstNet)
            layer_id[name] = id;
            connectToAllBlobs(layer_id, dstNet, parsePin(layer.input(0)), id, layer.input_size());
        }
        else if (type == "Selu")
        {
            layerParams.set("scale", 1.0507009873554804934193349852946f);
Member

Could you please provide the origin of these magic values? Are they the result of some optimization problem that we can reproduce locally, or are they just irrational numbers with a formula too complicated to include? Either way, it would be nice to add a reference to some proof that they are optimal.

@lakshayg
Author

lakshayg commented Feb 27, 2018 via email

@dkurt
Member

dkurt commented Mar 21, 2018

@lakshayg, thank you for your contribution! Unfortunately, I did not find any strong proof of why this activation is so useful. I think we can add this layer later, when it becomes more popular. For now, you can test custom layer creation from PR #11129.

@lakshayg
Author

lakshayg commented Mar 22, 2018 via email

@dkurt
Member

dkurt commented Apr 25, 2018

Please take a look at the custom layer registration tutorial: https://docs.opencv.org/master/dc/db1/tutorial_dnn_custom_layers.html.

@dkurt dkurt closed this Apr 25, 2018
@AlvarHHM

AlvarHHM commented Oct 4, 2018

> @lakshayg, thank you for your contribution! Unfortunately, I did not find any strong proof of why this activation is so useful. I think we can add this layer later, when it becomes more popular. For now, you can test custom layer creation from PR #11129.

PyTorch, Keras, and TensorFlow all have native support for SELU, which is strong evidence that this activation layer is popular enough for OpenCV to support it.

P.S. I would be grateful if someone can provide a custom layer implementation.
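
A minimal sketch of such a custom layer, loosely following the tutorial linked above and assuming the current cv::dnn custom-layer API; the `SeluLayer` class name and the registration shown in the usage comment are illustrative, not an existing implementation.

```cpp
#include <opencv2/dnn.hpp>
#include <opencv2/dnn/layer.details.hpp>  // CV_DNN_REGISTER_LAYER_CLASS
#include <cmath>
#include <vector>

class SeluLayer : public cv::dnn::Layer
{
public:
    SeluLayer(const cv::dnn::LayerParams &params) : Layer(params) {}

    static cv::Ptr<cv::dnn::Layer> create(cv::dnn::LayerParams &params)
    {
        return cv::Ptr<cv::dnn::Layer>(new SeluLayer(params));
    }

    // Element-wise activation: output shapes mirror the input shapes.
    virtual bool getMemoryShapes(const std::vector<std::vector<int> > &inputs,
                                 const int requiredOutputs,
                                 std::vector<std::vector<int> > &outputs,
                                 std::vector<std::vector<int> > &internals) const CV_OVERRIDE
    {
        CV_UNUSED(requiredOutputs); CV_UNUSED(internals);
        outputs = inputs;
        return false;
    }

    virtual void forward(cv::InputArrayOfArrays inputs_arr,
                         cv::OutputArrayOfArrays outputs_arr,
                         cv::OutputArrayOfArrays internals_arr) CV_OVERRIDE
    {
        CV_UNUSED(internals_arr);
        std::vector<cv::Mat> inputs, outputs;
        inputs_arr.getMatVector(inputs);
        outputs_arr.getMatVector(outputs);

        // Constants from Klambauer et al. (2017), matching the values in this PR's diff.
        const float scale = 1.0507009873554804934193349852946f;
        const float alpha = 1.6732632423543772848170429916717f;

        const cv::Mat &src = inputs[0];
        cv::Mat &dst = outputs[0];
        CV_Assert(src.type() == CV_32F && src.isContinuous() && dst.isContinuous());

        const float *srcData = src.ptr<float>();
        float *dstData = dst.ptr<float>();
        for (size_t i = 0; i < src.total(); ++i)
        {
            const float x = srcData[i];
            dstData[i] = x > 0.f ? scale * x : scale * alpha * (std::exp(x) - 1.f);
        }
    }
};

// Usage (register before importing the model), as in the custom layers tutorial;
// the model path is a placeholder:
//   CV_DNN_REGISTER_LAYER_CLASS(Selu, SeluLayer);
//   cv::dnn::Net net = cv::dnn::readNetFromTensorflow("frozen_graph.pb");
```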
