
Support alternate model loaders in HFShim and HFWrapper #332

Merged: 2 commits, Jul 5, 2022

Conversation

adrianeboyd (Collaborator)

In order to support other model loaders such as `AutoModelForSequenceClassification`, make the loading config / model / tokenizer classes configurable in `huggingface_from_pretrained`, `HFWrapper` and `HFShim`.

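For illustration, a minimal sketch of the kind of configurability described, with hypothetical keyword names (`config_cls`, `model_cls`, `tokenizer_cls`) rather than the merged signature:

```python
from transformers import (
    AutoConfig,
    AutoModel,
    AutoModelForSequenceClassification,
    AutoTokenizer,
)

# Hypothetical sketch: the loader classes become keyword arguments,
# with the current Auto* classes as defaults.
def huggingface_from_pretrained(
    source: str,
    config_cls=AutoConfig,
    model_cls=AutoModel,
    tokenizer_cls=AutoTokenizer,
):
    config = config_cls.from_pretrained(source)
    tokenizer = tokenizer_cls.from_pretrained(source)
    model = model_cls.from_pretrained(source, config=config)
    return config, tokenizer, model

# An alternate loader can then be swapped in without touching the wrapper:
# huggingface_from_pretrained(
#     "bert-base-uncased", model_cls=AutoModelForSequenceClassification
# )
```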
adrianeboyd (Collaborator, Author)

I couldn't really figure out sensible types for the args because I don't think there's a single parent class type in `transformers`. It's possible there's some `Union[type[Auto*], type[Pretrained*]]` type that might work.

I think it would also be possible to switch to the `Callable` `from_config` / `from_pretrained` methods directly, but this initially seemed a bit cleaner?
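As a sketch of that `Callable` alternative (illustrative names, not the merged API), the args could be typed as plain callables, which would also sidestep the typing problem above:

```python
from typing import Any, Callable

from transformers import AutoModel, AutoTokenizer

# Hypothetical: accept the bound from_pretrained methods directly; the
# arguments are then just Callables, avoiding the missing common parent
# class in transformers.
def load_components(
    source: str,
    model_loader: Callable[..., Any] = AutoModel.from_pretrained,
    tokenizer_loader: Callable[..., Any] = AutoTokenizer.from_pretrained,
):
    return tokenizer_loader(source), model_loader(source)
```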

adrianeboyd (Collaborator, Author)

And there aren't any other `Auto` methods for `Config` or `Tokenizer`, but someone might want to specify a specific class for some reason in a custom component?
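For illustration, what such a custom component might do with a pinned class; `BertTokenizerFast` here is just an example, not anything the PR requires:

```python
from transformers import AutoConfig, AutoModel, BertTokenizerFast

# Hypothetical: a custom component pins BertTokenizerFast instead of the
# generic AutoTokenizer, while still using the Auto* loaders elsewhere.
config = AutoConfig.from_pretrained("bert-base-uncased")
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased", config=config)
```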

svlandeg (Member) left a comment


Good idea to make this more generic!

spacy_transformers/layers/hf_shim.py (review comment, resolved)