make theano.grad() build a GradOp and unfold it during compilation #4452

Open
nouiz opened this issue May 2, 2016 · 0 comments
nouiz commented May 2, 2016

This would help some use cases like the following (a sketch of the idea appears after the list):

  • Taking the grad on an unmerged graph.
    • Optimizing that gradient well in all cases would require lifting the elemwise add through every existing op, which is prohibitive.
    • Unmerged graphs sometimes happen with clones, when people try to make multiple replacements.
  • We already have an optimization that takes the unfolded softmax graph and substitutes the softmax op. With this change, we would not need an optimization that recognizes the unfolded gradient of softmax and replaces it with softmaxgrad (we don't have one now). So this would require fewer optimizations.
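
A minimal sketch of the idea, using hypothetical names (GradOp, unfold_grad) that are not part of Theano's API: theano.grad() would return the output of a placeholder GradOp, and a compilation-time optimization would later replace that placeholder with the gradient graph that theano.grad() builds eagerly today. Here the unfolding is done by hand to keep the example runnable.

```python
import theano
import theano.tensor as T
from theano.gof import Op, Apply


class GradOp(Op):
    """Symbolic placeholder for grad(cost, wrt), meant to be unfolded later."""

    __props__ = ()

    def make_node(self, cost, wrt):
        cost = T.as_tensor_variable(cost)
        wrt = T.as_tensor_variable(wrt)
        # The gradient has the same type as the variable we differentiate w.r.t.
        return Apply(self, [cost, wrt], [wrt.type()])

    def perform(self, node, inputs, output_storage):
        # Never executed: the placeholder must be replaced by the real
        # gradient graph before the function is compiled.
        raise NotImplementedError("GradOp must be unfolded before execution")


def unfold_grad(grad_var):
    """Replace a GradOp output with the graph theano.grad() builds today."""
    cost, wrt = grad_var.owner.inputs
    return theano.grad(cost, wrt)


# Build the placeholder now; "unfold" it by hand where a compilation-time
# optimization would normally do the substitution.
x = T.dvector("x")
cost = T.sum(x ** 2)
g_placeholder = GradOp()(cost, x)    # symbolic stand-in for d(cost)/dx
g_real = unfold_grad(g_placeholder)  # what the optimization would produce
f = theano.function([x], g_real)
print(f([1.0, 2.0, 3.0]))            # -> [ 2.  4.  6.]
```

Because the placeholder carries the original cost and wrt variables, the unfolding step can run after graph rewrites such as merging or cloning, which is what would make the gradient graph easier to optimize in the cases listed above.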