About the Model vars(parameters) updating in Meta-training state #24

Open
mRSun15 opened this issue May 1, 2019 · 1 comment
mRSun15 commented May 1, 2019

I see that you use a variable "vars" to store all of the model's parameters, which I think is clean and clear. However, my question is: why not use PyTorch's predefined functions state_dict() and load_state_dict() to update the vars?

Thanks!


IRNLPCoder commented Jul 11, 2019

My understanding is that load_state_dict overwrites the values of self.net.parameters() in place. But the meta update must be taken with respect to the base parameters self.net.parameters(); if those values are overwritten, the gradient ends up being computed only with respect to fast_weights (meta.py line 137), and the connection back to the base parameters is lost.
Is that right?
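To make that concrete, here is a toy scalar sketch (hypothetical, not code from this repo) of why the fast weights must remain a differentiable function of the base parameter instead of being written back into the model. With a squared-error loss, the meta-gradient picks up an extra chain-rule factor from the inner update; overwriting the parameter (as load_state_dict would) drops that factor.

```python
# Toy scalar MAML step. All names here are illustrative, not from the repo.

def loss(w, target):
    return (w - target) ** 2

def dloss(w, target):
    # d/dw of (w - target)^2
    return 2.0 * (w - target)

lr = 0.1          # inner-loop learning rate
w = 0.0           # base parameter (plays the role of self.net.parameters())
support, query = 1.0, 2.0

# Inner step: fast_w is kept as a *function* of w, mirroring fast_weights.
fast_w = w - lr * dloss(w, support)

# Correct meta-gradient, by the chain rule through the inner step:
#   d loss(fast_w, query) / dw = dloss(fast_w, query) * d(fast_w)/dw
#                              = dloss(fast_w, query) * (1 - lr * d2loss)
d2 = 2.0  # second derivative of the squared error is constant
meta_grad = dloss(fast_w, query) * (1.0 - lr * d2)

# If w had been overwritten in place (the load_state_dict scenario),
# autograd would only see the last factor and return the first-order value:
first_order_grad = dloss(fast_w, query)

print(meta_grad, first_order_grad)  # the (1 - lr * d2) factor differs
```

In PyTorch this is why the repo's forward pass takes vars as an explicit argument: the computation graph then runs from the base parameters through the inner update to the query loss, so autograd can produce the full second-order meta-gradient.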
