
[WIP] SameDiff losses/differentiation overhaul #7452

Merged

21 commits merged into master on Apr 8, 2019

Conversation

@AlexDBlack (Contributor) commented Apr 5, 2019

Allows users to specify which variables represent the losses. Differentiation is also restricted to the subset of the graph that is actually needed, which is more efficient; consequently, it is now possible to train only the part of the graph that the loss function depends on.

Fixes: #6876
Fixes: #7424 (though not in the proposed way)

Some other major changes:

  • VARIABLE-type SDVariables can now only be floating point. The VARIABLE type means "trainable", and by definition only floating-point variables are trainable by backpropagation.
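
A minimal sketch (not code from this PR) of how the loss-variable specification described above might look from the SameDiff API; the graph, the variable names, and the exact `setLossVariables` call are assumptions based on the description rather than on the change itself:

```java
import org.nd4j.autodiff.samediff.SDVariable;
import org.nd4j.autodiff.samediff.SameDiff;
import org.nd4j.linalg.api.buffer.DataType;

public class LossVariableSketch {
    public static void main(String[] args) {
        SameDiff sd = SameDiff.create();

        // Placeholders for features and labels (unknown batch size: -1)
        SDVariable in = sd.placeHolder("input", DataType.FLOAT, -1, 4);
        SDVariable label = sd.placeHolder("label", DataType.FLOAT, -1, 3);

        // Trainable parameters: VARIABLE type, hence floating point only
        SDVariable w = sd.var("w", DataType.FLOAT, 4, 3);
        SDVariable b = sd.var("b", DataType.FLOAT, 1, 3);

        SDVariable out = in.mmul(w).add("out", b);

        // A simple mean squared error built from elementwise ops
        SDVariable diff = out.sub(label);
        SDVariable loss = diff.mul(diff).mean("mseLoss");

        // Assumed API from this PR: mark which variable(s) represent the loss.
        // Only the part of the graph that "mseLoss" depends on should then be
        // differentiated and trained.
        sd.setLossVariables("mseLoss");
    }
}
```

With the loss variable marked explicitly, any parameters the loss does not depend on would simply be skipped during backprop, per the description above.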

@AlexDBlack AlexDBlack force-pushed the ab_samediff_losses branch from 9d2f394 to 65eec6a Apr 6, 2019

@treo left a comment

👍

Mostly flagging places where you forgot to change a println to a logger, and nitpicking about typos.

@AlexDBlack AlexDBlack force-pushed the ab_samediff_losses branch from 8b03058 to b55063e Apr 6, 2019

@AlexDBlack AlexDBlack force-pushed the ab_samediff_losses branch from 9fac06b to 346010b Apr 8, 2019

@AlexDBlack AlexDBlack merged commit 1c283e9 into master Apr 8, 2019

@AlexDBlack AlexDBlack deleted the ab_samediff_losses branch Apr 8, 2019
