Suppose I had a class method like:

```python
def call(self, X):
    for t in range(X.shape[0]):
        self.ans = some_function(X[t], self.ans)  # some_function is a parameterized operation
    # some more computation with self.ans
    # final step has a scalar loss function
```
Would I get correct gradients of this whole process? That is, will the computation graph that autograd constructs store the value of `self.ans` per iteration and use it?
One thing to keep in mind is that `ans` will come out boxed at the end:
```
In [1]: run issue67
[ 323.38327233  185.12997922  283.66369298  168.29411013  807.31095824]
Checking gradient of <bound method A.call of <__main__.A object at 0x10d2d9610>> at [-0.10688095  0.66919977 -0.45675244 -1.08241973 -0.91352716]
Gradient projection OK (numeric grad: -0.105982859395, analytic grad: -0.105982859699)

In [2]: print a.ans
Autograd ArrayNode with value [-1.005772 -1.005772 -1.005772 -1.005772 -1.005772] and 1 tape(s)
```
but its computation tape is completed and so it will act just like a regular array.
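To make that concrete, here is a minimal, self-contained sketch of the pattern. The class `A`, the `tanh` update inside `some_function`, and the zero initialization are assumptions made just to get a runnable example; they are not part of the original issue or the autograd API:

```python
# Minimal sketch: gradients through a loop that overwrites self.ans.
# some_function and the tanh update are illustrative stand-ins.
import autograd.numpy as np
from autograd import grad

def some_function(x, acc):
    # Stand-in for a parameterized, differentiable update.
    return np.tanh(acc + x)

class A(object):
    def call(self, X):
        self.ans = np.zeros(X.shape[0])  # reset state on every call
        for t in range(X.shape[0]):
            self.ans = some_function(X[t], self.ans)
        # "some more computation" ending in a scalar loss:
        return np.sum(self.ans ** 2)

a = A()
X = np.random.randn(5)
print(grad(a.call)(X))  # autograd traces every loop iteration, so each
                        # intermediate value of self.ans is on the tape
print(a.ans)            # a boxed node, but its tape is finished, so it
                        # behaves like an ordinary array from here on
```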
The code also works if the updated value of `self.ans` gets reused in future calls to `call` instead of getting reset to zeros as in the example I wrote. That just means the function changes every time you call it, which autograd can handle but `quick_grad_check` can't (because it invokes the function multiple times to check its numerical gradient).
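A hypothetical variant showing that behavior, reusing the imports and `some_function` from the sketch above (in older autograd releases the checker lived at `autograd.util.quick_grad_check`):

```python
# Variant where state persists across calls: the function computed by
# call is effectively different on every invocation.
class B(object):
    def __init__(self):
        self.ans = np.zeros(5)  # matches len(X) from the sketch above

    def call(self, X):
        for t in range(X.shape[0]):
            self.ans = some_function(X[t], self.ans)
        return np.sum(self.ans ** 2)

b = B()
print(grad(b.call)(X))  # correct gradient for this particular call...
print(grad(b.call)(X))  # ...but a different one now: self.ans has moved.
# quick_grad_check would report a mismatch here, since it re-evaluates
# the function to form numerical differences, and each evaluation sees
# a different starting state.
```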