From my perspective, the shape of `grad_output` should be broadcast up to `a_shape`.
Although this version still passes task3_1, I'm not sure it is correct in more complex situations.
```python
@staticmethod
def backward(ctx, grad_output):
    a_shape, dim = ctx.saved_values
    if dim is None:
        out = grad_output.zeros(a_shape)
        out._tensor._storage[:] = grad_output[0]
        return out
    else:
        # START Code Update
        return grad_output  # should be replaced by add_zip(grad_output, zeros(a_shape))
        # END Code Update
```
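To make the suggested fix concrete, here is a minimal, self-contained NumPy sketch of the broadcasting idea behind `add_zip(grad_output, zeros(a_shape))`. This is only an illustration, not minitorch's API; `sum_backward` and the shapes are hypothetical:

```python
import numpy as np

def sum_backward(grad_output: np.ndarray, a_shape: tuple) -> np.ndarray:
    # Adding zeros of shape a_shape plays the role of
    # add_zip(grad_output, zeros(a_shape)): broadcasting rules
    # stretch the reduced gradient back up to the input's shape.
    return grad_output + np.zeros(a_shape)

# d/da of sum(a, dim=0) is 1 everywhere, so the (1, 3) grad_output
# must cover the original (2, 3) input shape.
grad_output = np.ones((1, 3))
grad_a = sum_backward(grad_output, (2, 3))
assert grad_a.shape == (2, 3)
```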
I just realized that Case 2 in the `expand` function is there exactly to handle the mismatch between the original shape and the gradient shape caused by this return value.
```python
# Case 2: Backward is a smaller than self. Broadcast up.
true_shape = TensorData.shape_broadcast(self.shape, other.shape)
buf = self.zeros(true_shape)
self.backend._id_map(other, out=buf)
if self.shape == true_shape:
    return buf
```
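To see what Case 2 does in isolation, here is a rough NumPy analogue. The correspondences are assumptions for illustration: `shape_broadcast` behaves like `np.broadcast_shapes`, `id_map` like a broadcasting copy, and `expand_case2` is a made-up name:

```python
import numpy as np

def expand_case2(self_shape: tuple, other: np.ndarray) -> np.ndarray:
    # shape_broadcast: compute the shared broadcast shape.
    true_shape = np.broadcast_shapes(self_shape, other.shape)
    # zeros + id_map: copy `other` into a buffer, broadcasting up.
    buf = np.zeros(true_shape)
    buf[...] = other
    if self_shape == true_shape:
        return buf
    raise NotImplementedError("Case 3 would reduce the extra dims")

# The (1, 3) gradient returned by Sum.backward is expanded back
# to the (2, 3) shape of the original input.
grad = expand_case2((2, 3), np.ones((1, 3)))
assert grad.shape == (2, 3)
```

So even if `Sum.backward` returns `grad_output` with the reduced shape, `expand` repairs the mismatch before the gradient is accumulated.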