Hello @Kaixhin,

ACER/train.py, line 97 in f22b07c

I think that we should freeze the value of `z_star_p` by using `z_star_p.detach()`. The paper describes the second stage of the trust region update as follows:

> In the second stage, we take advantage of back-propagation. Specifically, the updated gradient with respect to φ_θ, that is z*, is back-propagated through the network to compute the derivatives with respect to the parameters.

Please let me know what you think.
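For illustration, here is a minimal sketch of the change I have in mind; the names `g`, `k`, `phi`, and `delta` are placeholders, not the actual variables in train.py:

```python
import torch

def trust_region_loss(g, k, phi, delta=1.0):
    """Sketch of the ACER trust region update.

    g:     policy gradient with respect to the distribution statistics phi
    k:     gradient of KL(average policy || current policy) w.r.t. phi
    phi:   distribution statistics produced by the network (requires grad)
    delta: trust region constraint
    """
    # z* = g - max(0, (k . g - delta) / ||k||^2) * k
    scale = ((torch.sum(k * g) - delta) / (torch.sum(k * k) + 1e-10)).clamp(min=0)
    z_star = g - scale * k
    # Freeze z*: it should act as a constant seed gradient, so gradients
    # must not flow back through its own computation
    z_star = z_star.detach()
    # Surrogate loss whose gradient w.r.t. phi is -z*, i.e. minimising it
    # back-propagates z* through the network, as in the quoted passage
    return -(z_star * phi).sum()
```

With the `detach()` in place, back-propagating this loss treats z* as a constant, which matches the paper's description of the second stage.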