
question about intrinsic.py #31

Open
itaneja2 opened this issue Sep 29, 2023 · 1 comment

Comments

@itaneja2

Some context: line 179 of the code sets param.requires_grad_(False). I'm a bit confused about why this needs to be False. When I try to reproduce this code in a different setting, my loss does not decrease; however, with param.requires_grad_(True), the loss does decrease. Either way, I'm unclear why it should matter, because the optimizer only updates intrinsic_parameter and intrinsic_said.

@dptam
Collaborator

dptam commented Nov 21, 2023

Yes, only intrinsic_parameter and intrinsic_said should be updated. When you set param.requires_grad to True but param is not added to the optimizer, does the value of param change during training?
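A minimal sketch of the diagnostic dptam suggests, not the repo's actual code: the names intrinsic_parameter, base, and projection are hypothetical stand-ins for the real setup. It shows that a parameter with requires_grad left True still accumulates gradients, but its value cannot change under optimizer.step() if it was never passed to the optimizer.

```python
import torch

# Toy reparameterization: only `intrinsic_parameter` goes in the optimizer.
base = torch.nn.Parameter(torch.randn(4))            # stands in for a frozen model weight
intrinsic_parameter = torch.nn.Parameter(torch.zeros(2))
projection = torch.randn(2, 4)                       # fixed random projection (not trained)

opt = torch.optim.SGD([intrinsic_parameter], lr=0.1)

base_before = base.detach().clone()
for _ in range(5):
    opt.zero_grad()
    weight = base + intrinsic_parameter @ projection  # reparameterized weight
    loss = (weight ** 2).sum()
    loss.backward()
    opt.step()

# `base` has requires_grad=True, so backward() fills base.grad,
# but optimizer.step() never touches it: it is not in the optimizer.
print(torch.equal(base, base_before))   # True: value unchanged
print(base.grad is not None)            # True: gradient was still computed
```

So if the loss only decreases with requires_grad_(True), it is worth checking whether the frozen params are somehow reaching the optimizer (e.g. via model.parameters()), since freezing them should not affect updates to intrinsic_parameter itself.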
