Selu Activation Function #1346

Merged
merged 11 commits into mlpack:master on Apr 7, 2018

@Prabhat-IIT
Contributor

Prabhat-IIT commented Mar 31, 2018

This is a continuation of #1287.
I've corrected all the style issues.

@Prabhat-IIT

Contributor

Prabhat-IIT commented Apr 1, 2018

@mlpack-jenkins test this please

*
* For more information, see the following paper.
*
* @code

@zoq

zoq Apr 3, 2018

Member

Let's put this into the class description; that way Doxygen can pick it up in the documentation generation step.

* title = {Self-Normalizing Neural Networks},
* journal = {Advances in Neural Information Processing Systems},
* year = {2017}
* }

@zoq

zoq Apr 3, 2018

Member

Looks like the last } could be removed.
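For reference, the class-level Doxygen citation block zoq is describing might look roughly like this; the BibTeX key and author list are filled in from the published paper and are not quoted from the PR itself:

/**
 * For more information, see the following paper.
 *
 * @code
 * @article{Klambauer2017,
 *   author  = {Klambauer, G{\"u}nter and Unterthiner, Thomas and
 *              Mayr, Andreas and Hochreiter, Sepp},
 *   title   = {Self-Normalizing Neural Networks},
 *   journal = {Advances in Neural Information Processing Systems},
 *   year    = {2017}
 * }
 * @endcode
 */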

* NOTE: Use this constructor for SELU activation function.
*
*/

@zoq

zoq Apr 3, 2018

Member

Picky style comment: remove the empty line.
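As a usage sketch, and assuming the NOTE refers to the parameterless constructor that fixes alpha and lambda to the self-normalizing constants (the header path and constructor signatures below are assumptions, not quoted from the PR):

#include <mlpack/methods/ann/layer/elu.hpp>

using namespace mlpack::ann;

// SELU: the default constructor is assumed to set the self-normalizing
// alpha and lambda values.
SELU selu;

// A plain ELU would instead take its alpha parameter explicitly.
ELU<arma::mat, arma::mat> elu(1.0);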

/*
* Simple SELU activation test to check whether the mean and variance remain
* invariant after passing normalized inputs through the function.
*

@zoq

zoq Apr 3, 2018

Member

Another picky style comment: remove the extra line here.
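For context, a minimal sketch of the invariance check that comment block describes, assuming the same move-based Forward() API used elsewhere in this PR and tolerances chosen only for illustration:

// Standard-normal inputs should remain roughly zero-mean and unit-variance
// after passing through SELU.
arma::mat input = arma::randn<arma::mat>(1000, 1);
arma::mat output;

SELU selu;
selu.Forward(std::move(input), std::move(output));

BOOST_REQUIRE_LE(arma::as_scalar(arma::abs(arma::mean(output))), 0.1);
BOOST_REQUIRE_LE(arma::as_scalar(arma::abs(arma::var(output) - 1)), 0.1);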

@zoq

zoq approved these changes Apr 5, 2018

Thanks for the contribution. I'll go ahead and merge this in 3 days to leave time for any other comments, and I'll fix the remaining minor style issues afterwards.

@rcurtin

rcurtin approved these changes Apr 6, 2018

Thanks for fixing the remaining issues. Looking forward to seeing this merged. 👍

}; // class ELU
using SELU = ELU<arma::mat, arma::mat>;

@rcurtin

rcurtin Apr 6, 2018

Member

Can we add a small block of documentation for this typedef please?
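A comment along these lines would satisfy the request; this is only a sketch, not the wording that ended up in the merged code:

/**
 * The SELU activation function is the ELU activation with alpha and lambda
 * fixed to the values that give the self-normalizing property described in
 * the paper cited in the class description.
 */
using SELU = ELU<arma::mat, arma::mat>;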

selu.Backward(std::move(input), std::move(error), std::move(derivatives));
BOOST_REQUIRE_LE(arma::as_scalar(arma::abs(arma::mean(derivatives) -
selu.Lambda()*(selu.Alpha()-1))), 10e-5);

@rcurtin

rcurtin Apr 6, 2018

Member

There are some horizontal whitespace issues here; if you can fix that it would be great: selu.Lambda() * (selu.Alpha() - 1). Also it seems to me like it would make more sense to use 1e-4 throughout instead of 10e-5.
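Applying both suggestions, the snippet would read roughly as follows:

selu.Backward(std::move(input), std::move(error), std::move(derivatives));
BOOST_REQUIRE_LE(arma::as_scalar(arma::abs(arma::mean(derivatives) -
    selu.Lambda() * (selu.Alpha() - 1))), 1e-4);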

@Prabhat-IIT

Contributor

Prabhat-IIT commented Apr 6, 2018

@mlpack-jenkins test this please

@Prabhat-IIT

Contributor

Prabhat-IIT commented Apr 7, 2018

Travis timed out :(

@zoq

Member

zoq commented Apr 7, 2018

Unfortunately, you can't trigger the Travis job from within GitHub; anyway, I think this looks good.

@zoq

Member

zoq commented Apr 7, 2018

With two approvals I'll merge this now, thanks @Prabhat-IIT for the contribution.

@zoq zoq merged commit ea63409 into mlpack:master Apr 7, 2018

4 of 5 checks passed

continuous-integration/travis-ci/pr: The Travis CI build could not complete due to an error
Memory Checks
Static Code Analysis Checks: Build finished.
Style Checks: Build finished.
continuous-integration/appveyor/pr: AppVeyor build succeeded
@dakshitagrawal97

Contributor

dakshitagrawal97 commented Apr 7, 2018

Thanks @Prabhat-IIT @zoq for seeing this to completion! :)
