Unclear which transformers version should be used when testing Tatoeba #85
Thanks for the response! I ran into issues running the Nevertheless, I followed the lines you pointed at and installed
Hmm, that's strange. I just checked the HuggingFace Transformers repo and
Apologies, apparently I uncommented the import statement when I was trying to make the code run and forgot to put it back! Now the code starts running with this message:
But then it gives me this error message:
I'm not even sure why it is trying to use XLM-RoBERTa when I explicitly tried using multilingual BERT.
I assume you're running the
Edit: Running
Oh, thank you, that was actually really helpful! Now the code seems to be running without issues. I am closing the issue because it has been sorted.
I installed all the necessary dependencies and tried running the Tatoeba task using

```
bash scripts/train.sh "bert-base-multilingual-cased" tatoeba
```

However, I immediately ran into an ImportError:
```
Traceback (most recent call last):
  File "/Users/marcellfekete/PycharmProjects/xtreme/third_party/evaluate_retrieval.py", line 39, in <module>
    from bert import BertForRetrieval
  File "/Users/marcellfekete/PycharmProjects/xtreme/third_party/bert.py", line 4, in <module>
    from transformers.modeling_bert import BertModel, BertPreTrainedModel
ModuleNotFoundError: No module named 'transformers.modeling_bert'
```
This is with transformers 4.17.0.
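The `ModuleNotFoundError` is consistent with the package reorganisation in Transformers 4.x, where the flat `transformers.modeling_bert` module was moved under `transformers.models.bert.modeling_bert`. A small sketch of the mapping (the 4.x path is the real one; treating the major version as the cutoff is my assumption about where the repo's imports break):

```python
def modeling_bert_path(version: str) -> str:
    """Return the import path for BERT modeling code for a given
    transformers version string.

    Assumption: the flat layout ``transformers.modeling_bert`` was used
    before the 4.x reorganisation; 4.x and later nest it under
    ``transformers.models.bert.modeling_bert``.
    """
    major = int(version.split(".")[0])
    if major >= 4:
        return "transformers.models.bert.modeling_bert"
    return "transformers.modeling_bert"
```

So under 4.17.0 the repo's `from transformers.modeling_bert import ...` lines would need to target the nested path instead.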
I tried downgrading transformers to versions 3.5 and 2.0, but then I ran into other issues.
```
Traceback (most recent call last):
  File "/Users/marcellfekete/PycharmProjects/xtreme/third_party/evaluate_retrieval.py", line 43, in <module>
    from xlm_roberta import XLMRobertaConfig, XLMRobertaForRetrieval, XLMRobertaModel
  File "/Users/marcellfekete/PycharmProjects/xtreme/third_party/xlm_roberta.py", line 24, in <module>
    from roberta import (
  File "/Users/marcellfekete/PycharmProjects/xtreme/third_party/roberta.py", line 27, in <module>
    from transformers.modeling_bert import BertEmbeddings, BertLayerNorm, BertModel, BertPreTrainedModel, gelu
ImportError: cannot import name 'BertLayerNorm' from 'transformers.modeling_bert' (/Users/marcellfekete/miniforge3/envs/rosetta/lib/python3.8/site-packages/transformers/modeling_bert.py)
```
This is with transformers 3.5.
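Here only one name in the import fails: `BertLayerNorm` (which in older Transformers releases was essentially an alias for `torch.nn.LayerNorm`) no longer exists in 3.5. One generic way to patch around a symbol that moved or was renamed between library versions is a fallback importer; this is a sketch, not part of the repo, and the candidate paths you would pass for `BertLayerNorm` are an assumption based on the traceback:

```python
import importlib


def import_first(*candidates):
    """Try each "module:attr" candidate in order and return the first one
    that imports successfully.

    Useful when a symbol moved between library versions, e.g.
    import_first("transformers.modeling_bert:BertLayerNorm",
                 "torch.nn:LayerNorm")
    (those two candidates are an assumption for this thread's case).
    """
    last_err = None
    for spec in candidates:
        mod_name, _, attr = spec.partition(":")
        try:
            mod = importlib.import_module(mod_name)
            return getattr(mod, attr) if attr else mod
        except (ImportError, AttributeError) as err:
            last_err = err
    raise ImportError(f"none of {candidates} could be imported") from last_err
```

With a helper like this, `third_party/roberta.py` could keep working across versions instead of hard-coding one import path.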
```
Traceback (most recent call last):
  File "/Users/marcellfekete/PycharmProjects/xtreme/third_party/evaluate_retrieval.py", line 57, in <module>
    "xlmr": (XLMRobertaConfig, XLMRobertaModel, XLMRobertaTokenizer),
NameError: name 'XLMRobertaTokenizer' is not defined
```
This is with transformers 2.0.
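The `NameError` under 2.0 suggests the XLM-RoBERTa classes simply did not exist yet in that release, so the tokenizer import silently failed earlier in the file. A rough guard (the exact release that introduced XLM-RoBERTa is my assumption; I am only sure it was in the 2.x line after 2.0):

```python
def version_tuple(v: str):
    """Parse a version string like "4.17.0" into a comparable int tuple,
    stopping at the first non-numeric component (e.g. "4.17.0.dev0")."""
    parts = []
    for p in v.split("."):
        if not p.isdigit():
            break
        parts.append(int(p))
    return tuple(parts)


# Assumption: XLM-RoBERTa support landed partway through the 2.x line,
# so transformers 2.0 predates XLMRobertaTokenizer entirely.
MIN_XLMR = (2, 2)


def supports_xlmr(version: str) -> bool:
    return version_tuple(version) >= MIN_XLMR
```

In other words, neither downgrade target bracketed the version the repo was written against, which is exactly why pinning the recommended version matters here.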
Do you have any advice? Which transformers version is recommended to run the tests?
I don't know if it matters, but I am trying to run on Apple Silicon using the Rosetta layer (because `faiss` does not install natively). Thank you!