
New Nd4j backprop ops for activations #211

Merged
merged 8 commits on Sep 2, 2019

Conversation

@rnett (Collaborator) commented Aug 31, 2019

Changes the nd4j ops to use the new(ish?) C++ backprop ops, and has the DL4J activation functions use them as well.
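
For context, the C++ backprop ops referred to here are the fused gradient kernels in libnd4j (names along the lines of relu_bp and tanh_bp): they take the forward input plus the incoming gradient and return dL/dinput with the chain rule already applied. Below is a minimal sketch of invoking one through the generic custom-op mechanism; the op name "tanh_bp" and its input ordering are assumptions for illustration, not taken from this PR's diff:

```java
import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.api.ops.DynamicCustomOp;
import org.nd4j.linalg.factory.Nd4j;

public class TanhBpSketch {
    public static void main(String[] args) {
        INDArray x = Nd4j.rand(DataType.FLOAT, 3, 4);       // forward input
        INDArray epsilon = Nd4j.rand(DataType.FLOAT, 3, 4); // dL/dy from the layer above
        INDArray dLdx = Nd4j.create(DataType.FLOAT, 3, 4);  // output buffer for dL/dx

        // "tanh_bp" is assumed to be the libnd4j op name; the typed Java wrapper
        // classes added in this PR would do the equivalent with a dedicated op object.
        DynamicCustomOp op = DynamicCustomOp.builder("tanh_bp")
                .addInputs(x, epsilon)
                .addOutputs(dLdx)
                .build();
        Nd4j.exec(op);

        System.out.println(dLdx);
    }
}
```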

Also adds ThresholdRelu ops and fixes a few other usages of deprecated methods.
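
ThresholdRelu is presumably the thresholded ReLU, f(x) = x for x > theta and 0 otherwise (with theta = 0 it reduces to plain ReLU). A tiny sketch of those semantics with plain INDArray math, independent of the new op class (the theta value here is just an example):

```java
import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class ThresholdReluSketch {
    public static void main(String[] args) {
        double theta = 0.5;                                   // threshold (assumed parameter)
        INDArray x = Nd4j.rand(DataType.FLOAT, 3, 4).subi(0.5);

        // f(x) = x where x > theta, 0 elsewhere
        INDArray mask = x.gt(theta).castTo(DataType.FLOAT);
        INDArray out = x.mul(mask);

        System.out.println(out);
    }
}
```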

Existing derivative ops and DifferentialFunctionFactory methods have been deprecated but not removed, since there could conceivably be a scenario where you want the derivative without the backprop multiplication. I can delete/update these if that's wanted.
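
To illustrate the distinction that motivates keeping them around: a derivative op gives you f'(x) on its own and leaves the chain-rule multiplication by the incoming gradient to the caller, whereas a backprop op folds that multiplication in. A hand-rolled sketch for tanh using plain INDArray math (no dependence on the specific op classes touched here):

```java
import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.ops.transforms.Transforms;

public class DerivativeVsBackprop {
    public static void main(String[] args) {
        INDArray x = Nd4j.rand(DataType.FLOAT, 3, 4);
        INDArray epsilon = Nd4j.rand(DataType.FLOAT, 3, 4); // dL/dy from the layer above

        // "Derivative only", the thing the deprecated ops expose:
        INDArray y = Transforms.tanh(x);
        INDArray fPrime = y.mul(y).rsubi(1.0);  // f'(x) = 1 - tanh(x)^2

        // The backprop multiplication the caller had to apply by hand:
        INDArray dLdx = fPrime.mul(epsilon);

        // A *_bp op computes dLdx in one call from (x, epsilon), so the separate
        // derivative is only useful when you want f'(x) itself.
        System.out.println(dLdx);
    }
}
```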

rnett added 8 commits Aug 31, 2019
new (for java at least) backprop ops
Signed-off-by: Ryan Nett <rnett@skymind.io>
update activation functions
Signed-off-by: Ryan Nett <rnett@skymind.io>
add differential functions for SameDiff
Signed-off-by: Ryan Nett <rnett@skymind.io>
deprecate old ops
Signed-off-by: Ryan Nett <rnett@skymind.io>
update correct old ops
Signed-off-by: Ryan Nett <rnett@skymind.io>
update ops backprop to use new ops
Signed-off-by: Ryan Nett <rnett@skymind.io>
misc updates for deprecated functions (mostly Nd4j.rand w/ vararg shape)
Signed-off-by: Ryan Nett <rnett@skymind.io>
remove old imports
Signed-off-by: Ryan Nett <rnett@skymind.io>
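
On the "misc updates for deprecated functions" commit above: the deprecated pattern is Nd4j.rand called with a bare vararg int shape. A rough sketch of the kind of replacement, assuming the DataType-aware overload available in the betas of this era; the actual replacement used in the commit may differ:

```java
import org.nd4j.linalg.api.buffer.DataType;
import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;

public class RandShapeSketch {
    public static void main(String[] args) {
        INDArray a = Nd4j.rand(2, 3, 4);                  // deprecated vararg-shape form
        INDArray b = Nd4j.rand(DataType.FLOAT, 2, 3, 4);  // data-type-aware form (assumed replacement)
        System.out.println(a.shapeInfoToString() + "\n" + b.shapeInfoToString());
    }
}
```
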
@AlexDBlack left a comment:

👍 Thanks
