
Bigbird fusion #425

Open · wants to merge 10 commits into master
Conversation

Soham2000: Adding Adapter support for BigBird Transformer Architecture

@calpt (Member) left a comment:

Hey @Soham2000,
thanks a lot for your efforts in adding adapter support to the BigBird model architecture. I've done an initial review and left some comments below. To complete the model integration, please make sure to follow the remaining implementation steps of our contribution guide, especially in the testing and documentation sections.

Let us know if you need any further help!

@@ -146,7 +146,7 @@
"models.bert_generation": ["BertGenerationConfig"],
"models.bert_japanese": ["BertJapaneseTokenizer", "CharacterTokenizer", "MecabTokenizer"],
"models.bertweet": ["BertweetTokenizer"],
"models.big_bird": ["BIG_BIRD_PRETRAINED_CONFIG_ARCHIVE_MAP", "BigBirdConfig"],
"models.big_bird": ["BIG_BIRD_PRETRAINED_CONFIG_ARCHIVE_MAP", "BigBirdConfig","BigBirdTokenizer"],

Suggested change
"models.big_bird": ["BIG_BIRD_PRETRAINED_CONFIG_ARCHIVE_MAP", "BigBirdConfig","BigBirdTokenizer"],
"models.big_bird": ["BIG_BIRD_PRETRAINED_CONFIG_ARCHIVE_MAP", "BigBirdConfig"],

@@ -2985,7 +2986,7 @@
from .models.bert_generation import BertGenerationConfig
from .models.bert_japanese import BertJapaneseTokenizer, CharacterTokenizer, MecabTokenizer
from .models.bertweet import BertweetTokenizer
from .models.big_bird import BIG_BIRD_PRETRAINED_CONFIG_ARCHIVE_MAP, BigBirdConfig
from .models.big_bird import BIG_BIRD_PRETRAINED_CONFIG_ARCHIVE_MAP, BigBirdConfig,BigBirdTokenizer

Suggested change
from .models.big_bird import BIG_BIRD_PRETRAINED_CONFIG_ARCHIVE_MAP, BigBirdConfig,BigBirdTokenizer
from .models.big_bird import BIG_BIRD_PRETRAINED_CONFIG_ARCHIVE_MAP, BigBirdConfig

@@ -0,0 +1,240 @@
# flake8: noqa

Please remove this file

Comment on lines +989 to +995

# print("#=================================================================================================")
# # print(self) #Robertamodelwithheads #Big BirdModel with heads
# print("#====================================================================================================")
# print(self.base_model) #Robertamodel #BigBirdmodel
# print("#======================================================================================================")


Please remove

@@ -0,0 +1,74 @@
import warnings

Please remove

self.add_prediction_head(head, overwrite_ok=overwrite_ok)


class BigBirdModelWithHeads(BigBirdAdapterModel):

Since the ...ModelWithHeads classes are deprecated, we don't want to add them for new model architectures anymore. Thus, please remove this class, thanks.
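For context, the deprecated `...ModelWithHeads` classes for existing architectures survive only as thin backwards-compatibility wrappers around the corresponding `AdapterModel` class that emit a deprecation warning. A minimal self-contained sketch of that pattern (the class bodies here are stand-ins, not the real library implementations):

```python
import warnings


class BigBirdAdapterModel:
    """Stand-in for the real adapter model class."""

    def __init__(self, *args, **kwargs):
        pass


class BigBirdModelWithHeads(BigBirdAdapterModel):
    """Deprecated alias kept only so old code keeps working."""

    def __init__(self, *args, **kwargs):
        warnings.warn(
            "This class has been renamed to `BigBirdAdapterModel` and will be "
            "removed in a future release.",
            FutureWarning,
        )
        super().__init__(*args, **kwargs)
```

New architectures skip the wrapper entirely and expose only the `AdapterModel` class, which is why the reviewer asks for this class to be dropped.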

@@ -30,6 +30,7 @@


Please remove changes in this file

@@ -917,7 +921,7 @@ def check_model_type(self, supported_models: Union[List[str], dict]):
else:
supported_models_names.append(model.__name__)
supported_models = supported_models_names
for item in ADAPTER_MODEL_MAPPING.values():
for item in MODEL_WITH_HEADS_MAPPING.values():

Please revert this change
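The revert matters because `check_model_type` builds its list of accepted class names by iterating the active model mapping; pointing it back at the old `MODEL_WITH_HEADS_MAPPING` would silently drop the newer `AdapterModel` classes. A rough stand-alone illustration of that lookup (the mapping contents below are hypothetical stand-ins, not the library's real entries):

```python
# Hypothetical stand-ins for the classes registered in the real mapping.
class BigBirdAdapterModel: ...
class BertAdapterModel: ...


ADAPTER_MODEL_MAPPING = {
    "big_bird": BigBirdAdapterModel,
    "bert": BertAdapterModel,
}


def supported_model_names(mapping):
    """Collect the class names the pipeline will accept,
    mirroring the loop in check_model_type."""
    names = []
    for item in mapping.values():
        names.append(item.__name__)
    return names


print(supported_model_names(ADAPTER_MODEL_MAPPING))
# → ['BigBirdAdapterModel', 'BertAdapterModel']
```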

@calpt calpt linked an issue Oct 13, 2022 that may be closed by this pull request
@calpt (Member) commented Sep 9, 2023:

Hey, thanks again for your efforts in contributing new model architectures to adapter-transformers and sorry for the silence on our side.

In the last few weeks, we've been working on a large refactoring of our project, which will ultimately result in the release of Adapters, the next-generation adapters library. See #584.

As a consequence, we plan to merge any new model integrations directly to the new codebase, which currently can be found on this branch. Unfortunately, this necessitates some changes in the model integration code (detailed here, see already integrated models such as BERT, BART etc. for reference).

If you'd be willing to update your model integration to target the new library yourself, we'd be super happy to help you on this. Otherwise, we might look into upgrading and merging some of the open model integration PRs ourselves in the future. For more details, again see #584.


Successfully merging this pull request may close these issues: Add support for BigBird

2 participants