
Add gradients to oneHot and argMax #1218

Merged: 6 commits merged into tensorflow:master from feature/backprop-argmax-onehot on Aug 19, 2018

Conversation

@indutny (Contributor) commented on Aug 10, 2018

Description

Add gradients to oneHot and argMax. Neither op is differentiable, so both gradients must be zero.


For repository owners only:

Please remember to apply all applicable tags to your pull request.
Tags: FEATURE

For more info see: https://github.com/tensorflow/tfjs/blob/master/DEVELOPMENT.md


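To illustrate the user-facing effect, here is a minimal sketch (not a test from this PR; the `toFloat()` cast is only there so the gradient machinery accepts a float32 output): backpropagating through `argMax` now produces an all-zero gradient instead of failing for lack of a gradient function.

```ts
import * as tf from '@tensorflow/tfjs-core';

// argMax is piecewise constant, so its derivative is zero wherever it is
// defined; with this change the gradient is simply reported as zeros.
const f = (x: tf.Tensor1D) => tf.argMax(x).toFloat();
const dfdx = tf.grad(f);

dfdx(tf.tensor1d([1, 3, 2])).print();  // [0, 0, 0]
```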

`oneHot` op is not differentiable and thus should have zero gradient
with the same shape and dtype as its input.
`argMax` op is not differentiable and thus should have zero gradient
with the same shape and dtype as its input.
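For reference, a minimal sketch of the same zero-gradient rule expressed through the public `tf.customGrad` API (the PR itself adds the gradient inside tfjs-core's op definitions; the wrapper name below is purely illustrative):

```ts
import * as tf from '@tensorflow/tfjs-core';

// Illustrative wrapper, not the PR's internal gradient registration.
const argMaxWithZeroGrad = tf.customGrad((x) => {
  const input = x as tf.Tensor;
  return {
    // Forward pass: plain argMax, cast so it can flow into float math.
    value: tf.argMax(input).toFloat(),
    // Backward pass: dy is ignored; the gradient is zeros with the same
    // shape and dtype as the input, as the commit messages above state.
    gradFunc: () => [tf.zerosLike(input)],
  };
});
```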
`tf.split()` and `tf.unstack()` modify the `begin` argument in a loop and thus
change its value before backpropagation runs. Make a copy of `begin` in
`slice` to prevent such changes.
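A small sketch of the aliasing issue that last commit fixes (hypothetical plain-TypeScript stand-ins, not the PR's actual diff): when the gradient closure captures the caller's `begin` array by reference, a `split()`-style loop that keeps mutating `begin` corrupts the value later used during backprop, so `slice` has to take a defensive copy.

```ts
// sliceWithGrad is a hypothetical stand-in for tf.slice's internals.
function sliceWithGrad(x: number[], begin: number[], size: number) {
  const beginCopy = begin.slice();  // defensive copy taken at call time
  const value = x.slice(beginCopy[0], beginCopy[0] + size);
  // Backprop scatters dy back into x's shape using the copied begin;
  // without the copy, a later mutation of `begin` would shift the result.
  const grad = (dy: number[]) => {
    const dx = new Array<number>(x.length).fill(0);
    dy.forEach((v, i) => dx[beginCopy[0] + i] = v);
    return dx;
  };
  return {value, grad};
}

// A split()-style caller that reuses and mutates `begin` in a loop:
const data = [1, 2, 3, 4];
const begin = [0];
const pieces: Array<ReturnType<typeof sliceWithGrad>> = [];
for (const size of [2, 2]) {
  pieces.push(sliceWithGrad(data, begin, size));
  begin[0] += size;  // this mutation must not leak into saved gradients
}
console.log(pieces[0].grad([1, 1]));  // [1, 1, 0, 0], not [0, 0, 1, 1]
```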
@indutny (Contributor, Author) commented on Aug 11, 2018

Sorry, I added one more commit to the branch. I'm reusing this branch to fix my code, so it's easier for me to just pile things up here.

`argMin` op is not differentiable and thus should have zero gradient
with the same shape and dtype as its input.
@shaunwarman commented

Would like to see this land as well 👍

`greaterEqual` op is not differentiable and thus should have zero
gradient with the same shape and dtype as its input.
@indutny (Contributor, Author) commented on Aug 17, 2018

Any chance this could be reviewed?

@nsthorat (Contributor) commented

Sorry for the delay, nice work!

I filed a bug to track the "correct" fix for this. For now, zeros are fine.

tensorflow/tfjs#619

@nsthorat merged commit bc7dcbd into tensorflow:master on Aug 19, 2018
@indutny (Contributor, Author) commented on Aug 20, 2018

👍 Thanks for the review!

@indutny deleted the feature/backprop-argmax-onehot branch on August 20, 2018 at 05:18