Which line is the code of meta-learning in Decomposed Meta-NER? #37
Comments
Hi @dongguanting, in fact, you can find the whole logic by tracing the running script.
Hi @iofu728, thank you for your answer! But I still have another question that bothers me. I find that the model backpropagates twice during the forward_meta function, namely in the inner update and in the outer forward_wuqh. I think this may be related to the MAML method, but why split it into two backward passes?
Hi @dongguanting, this is how MAML works. You can refer to the MAML paper or other tutorials such as the AAAI21 Meta-Learning Tutorial. In short, in the inner-update part, the model fine-tunes on the data of a specific task i starting from the original model parameters; the outer pass then evaluates the adapted parameters and backpropagates through the inner update back to the original parameters, which is why there are two backward passes.
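To make the "two backward passes" concrete, here is a minimal, hedged sketch of a MAML meta-update on scalar quadratic toy tasks. This is NOT the repository's code (the real implementation adapts a PyTorch NER model); the task losses, learning rates, and helper names below are purely illustrative assumptions. For quadratic losses the gradients are analytic, so the inner gradient (backward #1) and the outer gradient through the adapted parameters (backward #2) can be written explicitly:

```python
# Hypothetical MAML sketch on scalar quadratic tasks loss_t(theta) = (theta - t)^2.
# Illustrates why MAML needs two gradient computations per task:
# one for the inner (task-adaptation) step and one for the outer (meta) step.

ALPHA = 0.1   # inner-loop (task adaptation) learning rate, assumed value
BETA = 0.05   # outer-loop (meta) learning rate, assumed value

def grad(theta, target):
    # d/d(theta) of (theta - target)^2
    return 2.0 * (theta - target)

def maml_step(theta, tasks):
    """One meta-update. The 'two backwards' are:
    (1) the inner gradient used to adapt theta to each task, and
    (2) the outer gradient of the post-adaptation loss, chained
        through the inner update back to the ORIGINAL theta."""
    meta_grad = 0.0
    for t in tasks:
        # Backward #1: inner update, fine-tune on task t starting from theta
        theta_prime = theta - ALPHA * grad(theta, t)
        # Backward #2: gradient of loss(theta_prime) w.r.t. the original
        # theta; the chain rule passes through the inner step, whose
        # Jacobian here is d(theta_prime)/d(theta) = 1 - 2*ALPHA.
        meta_grad += grad(theta_prime, t) * (1.0 - 2.0 * ALPHA)
    return theta - BETA * meta_grad / len(tasks)

theta = 0.0
for _ in range(200):
    theta = maml_step(theta, tasks=[1.0, 3.0])
# theta converges toward the point minimizing the average post-adaptation loss
```

In a neural-network setting the second gradient cannot be written in closed form, so frameworks compute it by differentiating through the inner update (e.g. `create_graph=True` in PyTorch), which is why the training loop appears to call backward twice.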
Hi, I also have a similar question. Is there any parameter to control meta-learning? I want to reproduce the results of 1) Ours w/o MAML.
Hi @wjczf123, yeah, the code also supports full-supervision mode (w/o MAML). You can set the
Nice! Thank you very much.
May I ask which lines contain the code for the prototype network and MAML? I read your code carefully but cannot find them.