
Cannot load saved AdapterFusion from directory with model.load_adapter_fusion() #54

Closed
Labels: bug (Something isn't working)

Comments


om304 commented Sep 3, 2020

🐛 Bug

Information

Model I am using (Bert, XLNet ...): Bert-base

Language I am using the model on (English, Chinese ...): English

Adapter setup I am using (if any): AdapterFusion

The problem arises when using:

  • the official example scripts: (give details below)
  • my own modified scripts: (give details below)

The task I am working on is:

  • an official GLUE/SQuAD task: QQP, SNLI
  • my own task or dataset: (give details below)

To reproduce

Steps to reproduce the behavior:

  1. Train AdapterFusion using run_fusion_glue.py, loading two pre-trained single-task adapters "qqp" and "snli"
  2. AdapterFusion weights and config (adapter_fusion_config.json, pytorch_model_adapter_fusion.bin) saved in a directory /qqp,snli
  3. When trying to load AdapterFusion with model.load_adapter_fusion("qqp,snli") I get the following error message:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/om304/anaconda3/lib/python3.7/site-packages/transformers/adapter_model_mixin.py", line 837, in load_adapter_fusion
    load_dir, load_name = loader.load(adapter_fusion_name_or_path, load_as)
  File "/home/om304/anaconda3/lib/python3.7/site-packages/transformers/adapter_model_mixin.py", line 485, in load
    self.model.add_fusion(adapter_fusion_name, config["config"])
  File "/home/om304/anaconda3/lib/python3.7/site-packages/transformers/adapter_model_mixin.py", line 716, in add_fusion
    self.base_model.add_fusion_layer(adapter_names)
  File "/home/om304/anaconda3/lib/python3.7/site-packages/transformers/adapter_bert.py", line 585, in add_fusion_layer
    self.encoder.add_fusion_layer(adapter_names)
  File "/home/om304/anaconda3/lib/python3.7/site-packages/transformers/adapter_bert.py", line 479, in add_fusion_layer
    layer.add_fusion_layer(adapter_names)
  File "/home/om304/anaconda3/lib/python3.7/site-packages/transformers/adapter_bert.py", line 461, in add_fusion_layer
    self.attention.output.add_fusion_layer(adapter_names)
  File "/home/om304/anaconda3/lib/python3.7/site-packages/transformers/adapter_bert.py", line 69, in add_fusion_layer
    adapter_config = self.config.adapters.common_config(adapter_names)
  File "/home/om304/anaconda3/lib/python3.7/site-packages/transformers/adapter_config.py", line 243, in common_config
    adapter_config = AdapterConfig.from_dict(adapter_config)
  File "/home/om304/anaconda3/lib/python3.7/site-packages/transformers/adapter_config.py", line 87, in from_dict
    return cls(**config)
TypeError: ABCMeta object argument after ** must be a mapping, not NoneType
Expected behavior

I would expect to be able to load the trained adapter-fusion from the directory to which it was saved.

Environment info

  • transformers version: 2.11.0
  • Platform: Linux-4.15.0-58-generic-x86_64-with-debian-stretch-sid
  • Python version: 3.7.4
  • PyTorch version (GPU?): 1.4.0 (True)
  • Tensorflow version (GPU?): 2.1.0 (False)
  • Using GPU in script?: True
  • Using distributed or parallel set-up in script?: False
om304 added the bug label on Sep 3, 2020
calpt (Member) commented Sep 3, 2020

Hi @om304,

Thanks for reporting. Your issue might already be fixed with a recent update to the library. Please try to update to the most recent version of the master branch:

pip install -U git+https://github.com/adapter-hub/adapter-transformers.git

om304 (Author) commented Sep 4, 2020

Many thanks for the quick reply. I updated the library as recommended and repeated the steps above, and now I get this error message after running model.load_adapter_fusion():

  File "<stdin>", line 1, in <module>
  File "/home/om304/anaconda3/lib/python3.7/site-packages/transformers/adapter_model_mixin.py", line 867, in load_adapter_fusion
    load_dir, load_name = loader.load(adapter_fusion_name_or_path, load_as)
  File "/home/om304/anaconda3/lib/python3.7/site-packages/transformers/adapter_model_mixin.py", line 486, in load
    self.model.add_fusion(adapter_fusion_name, config["config"])
  File "/home/om304/anaconda3/lib/python3.7/site-packages/transformers/adapter_model_mixin.py", line 746, in add_fusion
    self.base_model.add_fusion_layer(adapter_names)
  File "/home/om304/anaconda3/lib/python3.7/site-packages/transformers/adapter_bert.py", line 582, in add_fusion_layer
    self.encoder.add_fusion_layer(adapter_names)
  File "/home/om304/anaconda3/lib/python3.7/site-packages/transformers/adapter_bert.py", line 473, in add_fusion_layer
    layer.add_fusion_layer(adapter_names)
  File "/home/om304/anaconda3/lib/python3.7/site-packages/transformers/adapter_bert.py", line 455, in add_fusion_layer
    self.attention.output.add_fusion_layer(adapter_names)
  File "/home/om304/anaconda3/lib/python3.7/site-packages/transformers/adapter_bert.py", line 69, in add_fusion_layer
    if self.config.adapters.common_config_value(adapter_names, "mh_adapter"):
  File "/home/om304/anaconda3/lib/python3.7/site-packages/transformers/adapter_config.py", line 249, in common_config_value
    config_value = self.get(name).get(attribute, None)
AttributeError: 'NoneType' object has no attribute 'get' 

calpt (Member) commented Sep 7, 2020

This error suggests that no adapter was found for one of the fusion tasks you provided, i.e. the model does not include an adapter for either "qqp" or "snli". Did you make sure to load both trained adapter modules using model.load_adapter() before loading the fusion layer using model.load_adapter_fusion()?
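To make the required ordering concrete, here is a minimal self-contained sketch of why the lookup fails. The class and method names are illustrative stand-ins, not the real adapter-transformers API; only the failure mode (a None config followed by .get()) mirrors the library's common_config_value() in adapter_config.py:

```python
# Minimal stand-in for the adapter registry, illustrating why loading
# a fusion layer fails when the individual adapters were never loaded.
# AdapterRegistry and its methods are hypothetical names for this sketch.

class AdapterRegistry:
    def __init__(self):
        self.configs = {}  # adapter name -> config dict

    def load_adapter(self, name):
        # The real library reads weights and config from disk here;
        # this sketch just registers a dummy config.
        self.configs[name] = {"mh_adapter": True}

    def load_adapter_fusion(self, fusion_name):
        # Mirrors the failing lookup: for a name that was never loaded,
        # self.configs.get(name) returns None, and None.get(...) raises
        # AttributeError: 'NoneType' object has no attribute 'get'.
        for name in fusion_name.split(","):
            self.configs.get(name).get("mh_adapter")
        return f"fusion({fusion_name})"

model = AdapterRegistry()
model.load_adapter("qqp")
model.load_adapter("snli")
print(model.load_adapter_fusion("qqp,snli"))  # both adapters present, so this succeeds
```

Calling load_adapter_fusion("qqp,snli") on a fresh registry, without the two load_adapter() calls, reproduces the AttributeError from the traceback above.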

om304 (Author) commented Sep 7, 2020

That was it: I wasn't loading the individual task adapters with model.load_adapter() before calling model.load_adapter_fusion(). Many thanks for your help; it's all working now!

om304 closed this as completed on Sep 7, 2020
Yujin-Yujin commented

Cannot load either
