Some context: on line 179 of the code we have `param.requires_grad_(False)`, and I'm confused about why this needs to be set to False. When I try to reproduce this code in a different setting, my loss does not decrease; with `param.requires_grad_(True)` it does. Either way, I'm unclear why it should matter, because the optimizer only updates `intrinsic_parameter` and `intrinsic_said`.
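For reference, here is a minimal sketch of the setup I understand the code to implement (not the repo's exact code; the shapes, `d_int`, and the random projection `P` are illustrative assumptions). The base weights are frozen and the effective weight is rebuilt functionally each forward pass, so gradients flow only to `intrinsic_parameter` and `intrinsic_said`:

```python
import torch
import torch.nn.functional as F

torch.manual_seed(0)

# Frozen initial weight (requires_grad=False by default) and a fixed random
# projection from the low-dimensional intrinsic space to the full weight space.
w0 = torch.randn(10, 784)
d_int = 100
P = torch.randn(w0.numel(), d_int) / d_int ** 0.5

# Only these two tensors are trainable, matching the optimizer in the repo.
intrinsic_parameter = torch.zeros(d_int, requires_grad=True)
intrinsic_said = torch.ones(1, requires_grad=True)

optimizer = torch.optim.Adam([intrinsic_parameter, intrinsic_said], lr=1e-2)

# Dummy batch for illustration.
x = torch.randn(32, 784)
y = torch.randint(0, 10, (32,))

for step in range(100):
    # Effective weight = frozen init + projected intrinsic offset.
    w = w0 + (P @ intrinsic_parameter).view_as(w0) * intrinsic_said
    loss = F.cross_entropy(x @ w.t(), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In this formulation the loss decreases even though `w0.requires_grad` is False, since the gradient reaches `intrinsic_parameter` through the projection rather than through `w0`.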
Yes, only `intrinsic_parameter` and `intrinsic_said` should be updated. When you set `param.requires_grad` to True but `param` is not added to the optimizer, does the value of `param` change during training?
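One way to run that check (a sketch with a hypothetical stand-in `model`; substitute your own network and training loop) is to snapshot the base parameters before training and compare afterwards:

```python
import torch
import torch.nn as nn

# Hypothetical stand-in for the base model whose parameters should stay fixed.
model = nn.Linear(784, 10)
for p in model.parameters():
    p.requires_grad_(True)  # the setting the question asks about

# Snapshot every base parameter before training.
before = {n: p.detach().clone() for n, p in model.named_parameters()}

# ... run your training loop here; the optimizer should contain only
# intrinsic_parameter and intrinsic_said ...

# Any parameter that differs afterwards was modified by something other than
# the optimizer step on the intrinsic parameters (e.g. an in-place write).
for n, p in model.named_parameters():
    if not torch.equal(before[n], p.detach()):
        print(f"{n} changed during training")
```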