3284 bug support transformers 4310 #3289

Merged
merged 6 commits from 3284-bug-support-transformers-4310 into master on Aug 7, 2023

Conversation

helpmefindaname
Collaborator

@helpmefindaname helpmefindaname commented Jul 31, 2023

Closes #3284

Can be tested by running python -c "from flair.models import SequenceTagger;model = SequenceTagger.load('ner-large')" with transformers 4.31.0 installed
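
For context, the compatibility handling looks roughly like the sketch below. This is based on the snippet visible in the traceback further down this thread, not the exact diff, and the helper name is made up:

import transformers
from semver import Version

def strip_legacy_position_ids(model_state_dict: dict) -> dict:
    # transformers >= 4.31.0 no longer registers "embeddings.position_ids" as a
    # persistent buffer, so checkpoints saved with older versions contain a key
    # that load_state_dict would now reject; drop it before loading.
    # (Comparing the raw version string to a semver Version works for released
    # versions; dev builds like "4.32.0.dev0" are discussed further down.)
    if transformers.__version__ >= Version(4, 31, 0):
        model_state_dict.pop("embeddings.position_ids", None)
    return model_state_dict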

@helpmefindaname helpmefindaname linked an issue Jul 31, 2023 that may be closed by this pull request
@helpmefindaname helpmefindaname force-pushed the 3284-bug-support-transformers-4310 branch from 2d87fb0 to c492abf on July 31, 2023 18:04
@alanakbik
Collaborator

Works, thanks for fixing this @helpmefindaname!

@alanakbik alanakbik merged commit f54e6d6 into master Aug 7, 2023
1 check passed
@alanakbik alanakbik deleted the 3284-bug-support-transformers-4310 branch August 7, 2023 14:13
@stefan-it
Member

stefan-it commented Sep 5, 2023

Hey @helpmefindaname and @alanakbik,

I got an error when loading flair/ner-german-large from the Model Hub via:

from flair.models import SequenceTagger

tagger = SequenceTagger.load("flair/ner-german-large")

semver then raises a ValueError:

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
Cell In[5], line 1
----> 1 tagger_2 = SequenceTagger.load("flair/ner-german-large")

File ~/.venvs/dev/lib/python3.11/site-packages/flair/models/sequence_tagger_model.py:1027, in SequenceTagger.load(cls, model_path)
   1023 @classmethod
   1024 def load(cls, model_path: Union[str, Path, Dict[str, Any]]) -> "SequenceTagger":
   1025     from typing import cast
-> 1027     return cast("SequenceTagger", super().load(model_path=model_path))

File ~/.venvs/dev/lib/python3.11/site-packages/flair/nn/model.py:537, in Classifier.load(cls, model_path)
    533 @classmethod
    534 def load(cls, model_path: Union[str, Path, Dict[str, Any]]) -> "Classifier":
    535     from typing import cast
--> 537     return cast("Classifier", super().load(model_path=model_path))

File ~/.venvs/dev/lib/python3.11/site-packages/flair/nn/model.py:163, in Model.load(cls, model_path)
    161 if not isinstance(model_path, dict):
    162     model_file = cls._fetch_model(str(model_path))
--> 163     state = load_torch_state(model_file)
    164 else:
    165     state = model_path

File ~/.venvs/dev/lib/python3.11/site-packages/flair/file_utils.py:352, in load_torch_state(model_file)
    348 # load_big_file is a workaround by https://github.com/highway11git
    349 # to load models on some Mac/Windows setups
    350 # see https://github.com/zalandoresearch/flair/issues/351
    351 f = load_big_file(model_file)
--> 352 return torch.load(f, map_location="cpu")

File ~/.venvs/dev/lib/python3.11/site-packages/torch/serialization.py:809, in load(f, map_location, pickle_module, weights_only, **pickle_load_args)
    807             except RuntimeError as e:
    808                 raise pickle.UnpicklingError(UNSAFE_MESSAGE + str(e)) from None
--> 809         return _load(opened_zipfile, map_location, pickle_module, **pickle_load_args)
    810 if weights_only:
    811     try:

File ~/.venvs/dev/lib/python3.11/site-packages/torch/serialization.py:1172, in _load(zip_file, map_location, pickle_module, pickle_file, **pickle_load_args)
   1170 unpickler = UnpicklerWrapper(data_file, **pickle_load_args)
   1171 unpickler.persistent_load = persistent_load
-> 1172 result = unpickler.load()
   1174 torch._utils._validate_loaded_sparse_tensors()
   1176 return result

File ~/.venvs/dev/lib/python3.11/site-packages/flair/embeddings/transformer.py:1210, in TransformerEmbeddings.__setstate__(self, state)
   1207     self.__dict__[key] = embedding.__dict__[key]
   1209 if model_state_dict:
-> 1210     if transformers.__version__ >= Version(4, 31, 0):
   1211         model_state_dict.pop("embeddings.position_ids", None)
   1212     self.model.load_state_dict(model_state_dict)

File ~/.venvs/dev/lib/python3.11/site-packages/semver/version.py:50, in _comparator.<locals>.wrapper(self, other)
     48 if not isinstance(other, comparable_types):
     49     return NotImplemented
---> 50 return operator(self, other)

File ~/.venvs/dev/lib/python3.11/site-packages/semver/version.py:481, in Version.__le__(self, other)
    479 @_comparator
    480 def __le__(self, other: Comparable) -> bool:
--> 481     return self.compare(other) <= 0

File ~/.venvs/dev/lib/python3.11/site-packages/semver/version.py:396, in Version.compare(self, other)
    394 cls = type(self)
    395 if isinstance(other, String.__args__):  # type: ignore
--> 396     other = cls.parse(other)
    397 elif isinstance(other, dict):
    398     other = cls(**other)

File ~/.venvs/dev/lib/python3.11/site-packages/semver/version.py:646, in Version.parse(cls, version, optional_minor_and_patch)
    644     match = cls._REGEX.match(version)
    645 if match is None:
--> 646     raise ValueError(f"{version} is not valid SemVer string")
    648 matched_version_parts: Dict[str, Any] = match.groupdict()
    649 if not matched_version_parts["minor"]:

ValueError: 4.32.0.dev0 is not valid SemVer string

The error only occurs when trying to load the model with an unreleased version of Transformers (e.g. when installing it from the main branch). So this shouldn't affect regular users, but maybe we can find a solution that also works with these kinds of versions:

import transformers

print(transformers.__version__)

# Outputs 4.32.0.dev0
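
For reference, here is a minimal reproduction of the parse failure (assuming semver 3.x, as in the traceback above; strict SemVer has no notion of PEP 440 ".dev" suffixes):

import semver

try:
    semver.Version.parse("4.32.0.dev0")
except ValueError as err:
    print(err)  # 4.32.0.dev0 is not valid SemVer string

print(semver.Version.parse("4.31.0"))  # a plain release string parses fine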

@helpmefindaname
Collaborator Author

That's a good catch!
I suppose it should be easy to hotfix by editing line flair/embeddings/transformer.py:1210 and truncating the version, e.g.
using ".".join(transformers.__version__.split(".")[:2]) instead of transformers.__version__
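
A rough sketch of that idea (not an actual patch; since a two-component string like "4.32" is not strict SemVer either, the sketch parses it with optional_minor_and_patch=True):

import transformers
from semver import Version

# e.g. "4.32.0.dev0" -> "4.32"; parse it leniently so the missing patch defaults to 0
truncated = ".".join(transformers.__version__.split(".")[:2])
if Version.parse(truncated, optional_minor_and_patch=True) >= Version(4, 31, 0):
    print("transformers >= 4.31.0, strip embeddings.position_ids")

Keeping the first three numeric components (split(".")[:3]) would also work and would avoid the lenient parse.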
