[Bug]: Support Transformers 4.31.0 #3284

Closed
konrad0101 opened this issue Jul 21, 2023 · 11 comments · Fixed by #3289
Labels
bug Something isn't working

Comments


konrad0101 commented Jul 21, 2023

Describe the bug

When the python environment is updated to use the latest version of transformers, SequenceTagger fails due to an unexpected key in state_dict:

Error(s) in loading state_dict for SequenceTagger:
	Unexpected key(s) in state_dict: "embeddings.model.embeddings.position_ids".
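For context on the mechanism (an illustration in plain PyTorch, not flair code): transformers 4.31.0 seems to register position_ids as a non-persistent buffer, so checkpoints written by older versions carry a key that the default strict load_state_dict now rejects. A minimal reproduction of that mismatch:

import torch
import torch.nn as nn

class OldStyle(nn.Module):
    def __init__(self):
        super().__init__()
        # pre-4.31.0 behaviour: persistent buffer, written into the state dict
        self.register_buffer("position_ids", torch.arange(8))

class NewStyle(nn.Module):
    def __init__(self):
        super().__init__()
        # 4.31.0 behaviour: non-persistent, so the key is neither saved nor expected
        self.register_buffer("position_ids", torch.arange(8), persistent=False)

# raises RuntimeError: Unexpected key(s) in state_dict: "position_ids"
NewStyle().load_state_dict(OldStyle().state_dict())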

To Reproduce

edit by @helpmefindaname
(install transformers>=4.31.0)

from flair.models import SequenceTagger
model = SequenceTagger.load("ner-large")

Expected behavior

SequenceTagger should load without issue.

Logs and Stack traces

No response

Screenshots

No response

Additional Context

No response

Environment

Versions:
Flair: 0.12.2
Pytorch: 2.0.1+cu118
Transformers: 4.31.0
GPU: True

konrad0101 added the bug label Jul 21, 2023

M4nouel commented Jul 23, 2023

Same here, pinning transformers to 4.30.2 solves the problem.
So far, I've noticed similar recent issues (facebookresearch/DPR#165, huggingface/transformers#24996) but no fix in progress.
If I may, I would suggest patching this issue by preventing transformers 4.31.0 from being used until a fix is released.
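For anyone applying that pin, a requirements.txt entry along these lines should do it (the working version is taken from this thread):

# stay below the breaking release until a fix lands
transformers==4.30.2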

M4nouel pushed a commit to M4nouel/flair that referenced this issue Jul 23, 2023
@helpmefindaname
Collaborator

Hi @konrad0101 @M4nouel,
thank you for reporting this.

If I understand this correctly, this only applies to loading old models; loading models trained with the same version still works fine. Hence I am against preventing transformers 4.31.0 from being installed.

I will work on a proper fix; until then you can either (a third, unofficial stopgap is sketched after this list):

  • Retrain your models using transformers 4.31.0
  • Not use transformers 4.31.0
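A third, unofficial stopgap, sketched under the assumption that the stale position_ids buffer is the only mismatch (the checkpoint path and model variable below are hypothetical, and this is not the fix that will land in flair):

import torch

def strip_position_ids(state_dict):
    # drop the stale buffer entries that transformers >= 4.31.0 no longer expects
    return {k: v for k, v in state_dict.items() if not k.endswith("position_ids")}

# hypothetical usage with a plain PyTorch checkpoint:
# state = torch.load("old_model.pt", map_location="cpu")
# model.load_state_dict(strip_position_ids(state))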

konrad0101 (Author) commented Jul 24, 2023

As long as inference is done using 4.31.0, the issue exists. Training on 4.31.0 doesn't solve the problem.


aronnoordhoek commented Jul 24, 2023

Using

transformers=4.30.2 
flair=0.12.2 

for

SequenceTagger.load("flair/ner-dutch-large") 

still results in an error on my side: never mind, a '^' slipped in when changing the poetry toml, therefore allowing an update to 4.31.0

Error(s) in loading state_dict for XLMRobertaModel:
	Unexpected key(s) in state_dict: "embeddings.position_ids"
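For anyone hitting the same poetry pitfall: a caret constraint allows minor-version upgrades, so only an exact pin keeps the working release. A hypothetical pyproject.toml fragment:

[tool.poetry.dependencies]
# transformers = "^4.30.2"  # caret also allows 4.31.0: the trap described above
transformers = "4.30.2"     # exact pin stays on the working version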


M4nouel commented Jul 25, 2023

OK @helpmefindaname, I tried retraining an old model using transformers 4.31.0 and it seems to work. I will use transformers 4.30.2 for the models I can't or don't want to retrain, and train/retrain the others without pinning the transformers lib. Thanks a lot!

@helpmefindaname
Collaborator

#3289 is now out, which should fix the problem. Can you please check whether it works for your case and report back if there are any problems with it?
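For anyone testing without cloning, pip can install straight from the pull-request ref (assuming GitHub's usual refs/pull convention and the flairNLP/flair upstream):

pip install "git+https://github.com/flairNLP/flair.git@refs/pull/3289/head"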


M4nouel commented Aug 2, 2023

Hi @helpmefindaname, I built your branch and tried it with an old model and everything looks great! Thanks!

@konrad0101
Author

@helpmefindaname Thanks! The changes work for me with the new transformers version.

@s1lvester

Hey @alanakbik & @helpmefindaname!

Thanks for the fix, this works fine for me. Any plans to update the requirements.txt in main or to publish a minor release?

@helpmefindaname
Collaborator

Hi @s1lvester,
we plan to release a new version of flair soon.
In the meantime, you can always just update your own requirements.txt.
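For illustration, such a requirements.txt entry could point straight at the repository until the release is out (the branch name is assumed to be master here):

flair @ git+https://github.com/flairNLP/flair.git@master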

@l4b4r4b4b4

Actually, I just encountered this bug again and reverted to transformers 4.30 ...
