Test data leaked into meta training #3
NO.
Thank you for your response!
Yes, meta_optim.step() will update the \theta parameters only. The updates to fast_weights are ignored in the test sub-stage.
At the meta-testing phase, we need to [code elided]. Do you think there is any need to take out [code elided] instead of [code elided]?
Oh, maybe you are right.
I have already tried it, and it turned out that the optimization step needs to be left out in meta-testing. Otherwise it gives overly optimistic results, because the model would effectively be trained on the meta-test sets.
Ok, thx. Feel free to submit a PR. |
Hi, all. Please git pull to get the latest version, which fixes the test-data-leaking bug.
In the forward() function of maml.py (see here), the meta-optimizer step is taken regardless of whether the input data is for training or for testing. This leaks testing examples into the meta-learner, so the step needs to be guarded by the training flag.
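The fix discussed above can be sketched on a toy problem. This is not the repository's actual code; the names (theta, inner_lr, meta_lr, maml_step) are hypothetical, and plain analytic gradients on a 1-D linear model stand in for PyTorch autograd. The point is only the pattern: the inner (fast-weight) adaptation always runs, while the outer meta-update is gated by a training flag so that meta-test query data never changes theta.

```python
def loss(theta, x, y):
    # squared error for a 1-D linear model y ~ theta * x
    return (theta * x - y) ** 2

def grad(theta, x, y):
    # analytic gradient of the loss w.r.t. theta
    return 2 * x * (theta * x - y)

def maml_step(theta, support, query, inner_lr=0.1, meta_lr=0.01, training=True):
    """One first-order MAML-style step on a single task.

    The inner update produces fast weights from the support set; the
    outer (meta) update of theta runs only when `training` is True,
    mirroring the guard on meta_optim.step() discussed in this issue.
    """
    xs, ys = support
    xq, yq = query
    # inner loop: adapt fast weights on the support set
    fast = theta - inner_lr * grad(theta, xs, ys)
    # evaluate the adapted weights on the query set
    query_loss = loss(fast, xq, yq)
    if training:
        # outer loop: meta-update theta using the query-set gradient
        theta = theta - meta_lr * grad(fast, xq, yq)
    return theta, query_loss

theta = 0.0
# meta-training: theta is updated from the query loss
theta, _ = maml_step(theta, (1.0, 2.0), (1.0, 2.0), training=True)
# meta-testing: theta must stay fixed, so no test data leaks into it
theta_after, _ = maml_step(theta, (1.0, 2.0), (1.0, 2.0), training=False)
assert theta_after == theta
```

Without the `training` guard, the second call would also move theta, which is exactly the leakage (and the overly optimistic meta-test accuracy) reported here.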