Hi Dingfan, thank you for your work!

I'm a postgraduate student at Beihang University. I recently read your paper and have been trying to work out how the mechanism affects the gradient during the backward pass.

In your code (source/main.py, line 294), you set dynamic_hook_function = dp_conv_hook, i.e. you swap dummy_hook for dp_conv_hook so that the DP mechanism (gradient clipping and noise addition) takes effect.

However, I noticed that lines 301-302 set p.requires_grad = False, which freezes the netD parameters, so it seems dp_conv_hook would not modify any gradients during the backward pass. How can the hook still take effect? Or what should I do to make dp_conv_hook work?

Thank you!
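For readers unfamiliar with the clip-and-perturb pattern described above, here is a minimal, hypothetical sketch of a module backward hook that clips the per-sample gradient norm and adds Gaussian noise. It is not the repository's actual dp_conv_hook; the names dp_hook, CLIP_BOUND, and NOISE_MULT are made up for illustration.

```python
import torch
import torch.nn as nn

CLIP_BOUND = 1.0   # hypothetical per-sample clipping bound
NOISE_MULT = 1.07  # hypothetical noise multiplier

def dp_hook(module, grad_input, grad_output):
    # Clip the gradient flowing into the module's input to L2 norm <= CLIP_BOUND
    # per sample, then add Gaussian noise (the clip-and-perturb DP mechanism).
    grad = grad_input[0]
    grad_norm = grad.reshape(grad.size(0), -1).norm(2, dim=1)
    factor = (CLIP_BOUND / (grad_norm + 1e-12)).clamp(max=1.0)
    grad = grad * factor.view(-1, *([1] * (grad.dim() - 1)))
    grad = grad + NOISE_MULT * CLIP_BOUND * torch.randn_like(grad)
    # Return a tuple matching grad_input to replace the gradient downstream.
    return (grad,) + tuple(grad_input[1:])

layer = nn.Linear(4, 4)
layer.register_full_backward_hook(dp_hook)

x = torch.randn(2, 4, requires_grad=True)
layer(x).sum().backward()
print(x.grad.shape)  # the gradient reaching x has been clipped and perturbed
```

Because the hook rewrites grad_input, the sanitized gradient is what continues backward toward whatever produced x (e.g. a generator), regardless of the module's own parameters.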
Hi, thanks for your question. Briefly speaking, setting requires_grad=False does not stop gradients from flowing through non-leaf tensors (see the PyTorch documentation for details).
Also, we have uploaded a Colab notebook to the repository that verifies the generator's gradient is indeed modified as expected, even with requires_grad=False set on the discriminator parameters. You can play with it and check the results for now; we may add a unit test later.
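The point above can be demonstrated in a few lines: freezing a module's parameters only stops gradient accumulation *in those parameters*; the gradient w.r.t. the module's input is still computed, and backward hooks still fire. This toy example (a linear layer standing in for netD, a random tensor standing in for the generator output) is an illustration, not the repository's notebook.

```python
import torch
import torch.nn as nn

# Toy "discriminator" with frozen parameters, mimicking p.requires_grad = False.
netD = nn.Linear(8, 1)
for p in netD.parameters():
    p.requires_grad = False

seen = {}
def spy_hook(module, grad_input, grad_output):
    # Fires during backward even though the module's parameters are frozen.
    seen['grad_input'] = grad_input[0]

netD.register_full_backward_hook(spy_hook)

fake = torch.randn(4, 8, requires_grad=True)  # stands in for the generator output
netD(fake).sum().backward()

print(fake.grad is not None)     # True: the gradient still reaches the generator side
print(netD.weight.grad is None)  # True: frozen parameters accumulate no gradient
```

So a hook registered on the discriminator can intercept and sanitize the gradient on its way back to the generator, which is exactly the path the DP mechanism needs to act on.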
Thanks a lot! Your notebook really helps!
I really appreciate your patience and detailed step-by-step demonstration, which means a lot to me!
I sincerely wish you all the best in your research in Germany!