AttributeError: 'ORTModule' object has no attribute 'resize_token_embeddings' #53
Replacing it as in huggingface/transformers#7146 does not give the above error message. I would like to know if this is the right fix.
Hi Bhavya,

In general, ORTModule does not forward the attributes of the underlying model. For now, yes, this is the correct fix. However, this API is subject to change, as exposing the attribute .module to get the underlying model has led to issues elsewhere. Likely the name will change to something a bit less friendly, e.g. ._original_module.

For HF-GPT2, you should be able to use the following repository as-is. In the above, the ORTModule is inserted in the huggingface trainer.py script itself, and there is another tweak to ensure the DDP wrapping occurs correctly for more than one GPU. I run the model using the following launch command:

Let me know if you have other issues.

-- Suffian
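The attribute-forwarding behavior described above can be illustrated with a minimal sketch. Both classes here are hypothetical stand-ins, not the real torch_ort.ORTModule or a real Hugging Face model; they only mimic a wrapper that stores the underlying model on a .module attribute without forwarding its methods.

```python
class InnerModel:
    """Stand-in for a Hugging Face model exposing resize_token_embeddings."""
    def resize_token_embeddings(self, new_size):
        return f"resized to {new_size}"

class Wrapper:
    """Stand-in wrapper that, like ORTModule, does not forward attributes."""
    def __init__(self, module):
        # The underlying model; per the answer above, this attribute name
        # may change in future releases (e.g. to _original_module).
        self.module = module

model = Wrapper(InnerModel())

# Calling the method on the wrapper fails, matching the reported error shape:
try:
    model.resize_token_embeddings(50260)
except AttributeError as exc:
    print(exc)  # 'Wrapper' object has no attribute 'resize_token_embeddings'

# The suggested fix: reach through to the underlying model.
print(model.module.resize_token_embeddings(50260))  # prints: resized to 50260
```

The same pattern applies to any other model-specific method hidden by the wrapper: call it on the wrapped model, not on the wrapper.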
Thank you Suffian. I will try from https://github.com/microsoft/huggingface-transformers. |
Hi @bmedishe, is your issue resolved now? |
Yes, @natke, thank you. |
Great, thanks. I will close this issue. Please reach out again if you need to. |
Hi,
I am using ORT to run transformers/examples/pytorch/language-modeling/run_clm.py (fine-tuning GPT-2 on WikiText-2, using the raw WikiText-2; no tokens were replaced before tokenization). I am running it on the ROCm platform.
I edited the script like this:

from torch_ort import ORTModule

I am getting this error:

AttributeError: 'ORTModule' object has no attribute 'resize_token_embeddings'

Could you kindly help me resolve it?
Thank you
Bhavya
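The edit and failure described in the report can be sketched as follows. The import guard is added so the sketch runs even where torch-ort is not installed; the wrap shown in the comments is an assumption about how ORTModule is typically inserted into run_clm.py, not code copied from the original report.

```python
try:
    from torch_ort import ORTModule  # the import added to run_clm.py
    HAVE_TORCH_ORT = True
except ImportError:
    HAVE_TORCH_ORT = False  # torch-ort not available in this environment

# Inside run_clm.py, after the model is created, the wrap would look like:
#
#     model = AutoModelForCausalLM.from_pretrained(...)
#     model = ORTModule(model)
#
# The script's later call
#
#     model.resize_token_embeddings(len(tokenizer))
#
# then raises AttributeError, because ORTModule does not forward the
# wrapped model's attributes; calling it via the underlying model
# (the fix discussed in this thread) avoids the error.

print("torch-ort available:", HAVE_TORCH_ORT)
```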