Consider the following:
```python
from autograd import grad
import autograd.numpy as np
import copy

def summation_no_copy(x):
    total = 0
    for i in range(1, 3):
        total += i * np.sum(x)
    return total

def summation_copy(x):
    total = 0
    for i in range(1, 3):
        # deepcopy the input on every iteration before summing
        total += i * np.sum(copy.deepcopy(x))
    return total

x = np.array([1, 2, 3, 4, 3.5, 920, 0])
grad_copy = grad(summation_copy)
grad_no_copy = grad(summation_no_copy)
print(f'with deepcopy: {grad_copy(x)}')
print(f'without deepcopy: {grad_no_copy(x)}')
```
The output is:

```
with deepcopy: [1. 1. 1. 1. 1. 1. 1.]
without deepcopy: [3. 3. 3. 3. 3. 3. 3.]
```
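For reference, the function computes (1 + 2)·Σx, so the analytic gradient should be 3 in every coordinate; the `deepcopy` run only reflects the first (i = 1) pass. A quick central finite-difference check with plain NumPy (no autograd involved) confirms that the deepcopy-free result is the correct one:

```python
import numpy as np

def f(x):
    total = 0
    for i in range(1, 3):
        total += i * np.sum(x)
    return total

x = np.array([1, 2, 3, 4, 3.5, 920, 0], dtype=float)
eps = 1e-6
# central finite differences, one coordinate at a time
num_grad = np.array([
    (f(x + eps * e) - f(x - eps * e)) / (2 * eps)
    for e in np.eye(len(x))
])
print(num_grad)  # ≈ [3. 3. 3. 3. 3. 3. 3.]
```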
I'm not sure whether this is a bug, or if this simply isn't supported, but it seems like the gradient calculation is broken after the first deepcopy (i.e. doesn't consider any of the following deepcopies). I need the gradient of a more complicated function which requires the use of deepcopy: how can I go about getting it?