Possible bug in config file #17
Hi @aishwaryap
I was revisiting the code for the E.T. baseline, and there seems to be a bug in the config file used for training the model:
`teach/src/teach/modeling/ET/alfred/config.py`, line 183 at commit 5554f02
I believe it should be
`detach_lang_emb = True`
since we do not want to propagate the gradients through the look-up table or the language encoder. Please let me know your thoughts on this.
Thanks,
Divyam
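For context, a minimal PyTorch sketch of what a flag like `detach_lang_emb` typically gates; the module and names below are illustrative stand-ins, not the actual E.T. code:

```python
import torch
import torch.nn as nn

class LangEncoder(nn.Module):
    """Toy stand-in for the language branch: embedding look-up + encoder."""

    def __init__(self, vocab_size=100, emb_dim=32, detach_lang_emb=False):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb_dim)  # look-up table
        self.encoder = nn.Linear(emb_dim, emb_dim)    # stand-in language encoder
        self.detach_lang_emb = detach_lang_emb

    def forward(self, tokens):
        x = self.encoder(self.emb(tokens))
        if self.detach_lang_emb:
            # Cut the autograd graph here: downstream losses no longer send
            # gradients back into the encoder or the embedding look-up table.
            x = x.detach()
        return x
```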
Comments
Hi @dv-fenix, for the TEACh paper we are not doing any language pretraining, so we do want the gradients to propagate through the language encoder. This is the desired setting.
Hi @aishwaryap, thank you for the clarification!
Hi @dv-fenix, apologies for the late response, but in that situation I would still propagate the gradients (and hence keep `detach_lang_emb = False`).
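To make the two settings concrete, a quick gradient check built on the `LangEncoder` sketch above; the `head` layer is a hypothetical stand-in for whatever consumes the language features downstream:

```python
import torch
import torch.nn as nn

# Continues the LangEncoder sketch from the issue body above.
for detach in (False, True):
    lang = LangEncoder(detach_lang_emb=detach)
    head = nn.Linear(32, 1)  # hypothetical downstream task head
    loss = head(lang(torch.tensor([[1, 2, 3]]))).sum()
    loss.backward()
    # With detach_lang_emb=True the look-up table receives no gradient.
    print(f"detach={detach}: emb grad missing -> {lang.emb.weight.grad is None}")
```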
Oh cool! Thanks a lot @aishwaryap, I am closing this issue now.