R_op is ran on zero_grad nodes. #5792
Comments
Right, you can use …

@lamblin I've tried …

Indeed, Rop seems to ignore disconnected patterns and still calls Prod.R_op, which I did not expect.
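A conceptual sketch (plain Python, not Theano's actual API) of what respecting a disconnected pattern would look like: one R-operator step consults a per-op connection pattern and only pushes tangents through inputs the output actually depends on, so a disconnected input's sub-graph never has its R_op called. All names here (`MulConst`, `rop_step`, `connection_pattern` as a plain list) are hypothetical illustrations.

```python
class MulConst:
    """out = a * c, where c is treated as a non-differentiable input."""
    connection_pattern = [True, False]  # connected to a, disconnected from c

    @staticmethod
    def partials(a, c):
        # Partial derivatives w.r.t. each input; the second is never used
        # because its connection-pattern entry is False.
        return [c, a]

def rop_step(op, input_values, input_tangents):
    """One R-op step: sum partial_i * tangent_i over *connected* inputs only."""
    total = 0.0
    for connected, partial, tangent in zip(op.connection_pattern,
                                           op.partials(*input_values),
                                           input_tangents):
        if connected:
            total += partial * tangent
    return total

# Directional derivative of a*c w.r.t. a along tangent 1.0; the tangent 7.0
# supplied for the disconnected input c is ignored entirely.
print(rop_step(MulConst, [2.0, 5], [1.0, 7.0]))  # 5.0
```

The point of the sketch: if the traversal checked `connection_pattern` before recursing, the inner graph behind a disconnected input would never be visited at all.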
I've successfully implemented the Rop for …

However, in my code I'm not using … anywhere.

There seems to be some problem with sampling, however. It is in fact not coming from the Rop implementation itself or from zero_grad, but from something more subtle. The problem is that the Rop traverses everything, and the Prod for the sampling being …

Running this will raise an error. Commenting out the line with …

Any suggestions?
So, I keep getting this annoying error: …

This is because the calculation of the MRng Gaussian sampling contains a Prod operator. I've tried to wrap the sampled random variables in zero_grad, but Theano still tries to apply the R_op to them. Currently the code looks like this: …

The graph output is: …

As you can see, all of the Prod operators are under the mrg_uniform, which is in turn part of the mrg_normal, and they are all together under ZeroGrad. My question is: how can I actually avoid this and tell Theano not to …
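To make the problem concrete, here is a self-contained sketch (plain Python, not Theano) of why a naive R-operator traversal still reaches Prod's derivative rule inside a zero-gradient wrapper: the traversal recurses into children before asking whether the node blocks derivatives. All names (`Var`, `Prod`, `ZeroGrad`, `rop`) are hypothetical stand-ins for the Theano ops discussed above.

```python
class Var:
    def __init__(self, name, value):
        self.name, self.value = name, value

class Prod:
    def __init__(self, *inputs):
        self.inputs = inputs
    def value(self):
        out = 1.0
        for x in self.inputs:
            out *= val(x)
        return out

class ZeroGrad:
    """Marker node: the wrapped graph should act as a constant for derivatives."""
    def __init__(self, inner):
        self.inner = inner

def val(node):
    if isinstance(node, Var):
        return node.value
    if isinstance(node, Prod):
        return node.value()
    return val(node.inner)  # ZeroGrad is transparent for values

def rop(node, tangents, respect_zero_grad):
    """R-operator: directional derivative of `node` given input tangents."""
    if isinstance(node, Var):
        return tangents.get(node.name, 0.0)
    if isinstance(node, ZeroGrad):
        if respect_zero_grad:
            return 0.0  # treat the wrapped sub-graph as a constant
        # Naive traversal: recurse anyway, so Prod's rule below still runs.
        return rop(node.inner, tangents, respect_zero_grad)
    # Prod: product rule, sum_i (prod_{j != i} x_j) * dx_i
    total = 0.0
    for i, xi in enumerate(node.inputs):
        rest = 1.0
        for j, xj in enumerate(node.inputs):
            if j != i:
                rest *= val(xj)
        total += rest * rop(xi, tangents, respect_zero_grad)
    return total

x, y = Var("x", 2.0), Var("y", 3.0)
expr = ZeroGrad(Prod(x, y))
print(rop(expr, {"x": 1.0}, respect_zero_grad=False))  # 3.0: Prod's rule still runs
print(rop(expr, {"x": 1.0}, respect_zero_grad=True))   # 0.0: the wrapper blocks it
```

This mirrors the graph described in the issue: the Prod sits under the ZeroGrad, and unless the traversal short-circuits at the wrapper, the product's R_op is evaluated anyway.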