
unexpected keyword argument 'organization' in push_adapter_to_hub #498

Closed
bhavitvyamalik opened this issue Feb 20, 2023 · 3 comments
Labels
bug Something isn't working

Comments

@bhavitvyamalik

Environment info

  • adapter-transformers version: 2.1.2
  • Platform: Linux-5.8.0-59-generic-x86_64-with-glibc2.29
  • Python version: 3.8.10
  • PyTorch version (GPU?): 1.9.0+cu102 (True)
  • Tensorflow version (GPU?): not installed (NA)
  • Flax version (CPU?/GPU?/TPU?): not installed (NA)
  • Jax version: not installed
  • JaxLib version: not installed
  • Using GPU in script?: No
  • Using distributed or parallel set-up in script?: No

Information

Model I am using (Bert, XLNet ...): bert-base-uncased

I'm not able to push my adapter to the Hub because of this error:

Traceback (most recent call last):
  File "push_adapter.py", line 26, in <module>
    model.push_adapter_to_hub(
  File "/home/bhavitvya/.cache/pypoetry/virtualenvs/domadapter-VESkl83O-py3.8/lib/python3.8/site-packages/transformers/adapters/hub_mixin.py", line 155, in push_adapter_to_hub
    repo_url = self._get_repo_url_from_name(
  File "/home/bhavitvya/.cache/pypoetry/virtualenvs/domadapter-VESkl83O-py3.8/lib/python3.8/site-packages/transformers/file_utils.py", line 2056, in _get_repo_url_from_name
    return HfApi(endpoint=HUGGINGFACE_CO_RESOLVE_ENDPOINT).create_repo(
  File "/home/bhavitvya/.cache/pypoetry/virtualenvs/domadapter-VESkl83O-py3.8/lib/python3.8/site-packages/huggingface_hub/utils/_validators.py", line 124, in _inner_fn
    return fn(*args, **kwargs)
TypeError: create_repo() got an unexpected keyword argument 'organization'

To reproduce

import os

from transformers import AutoConfig, AutoModelWithHeads, AutoTokenizer

config = AutoConfig.from_pretrained("bert-base-uncased", output_hidden_states=True)
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelWithHeads.from_pretrained("bert-base-uncased", config=config)

full_path = os.path.join(os.getcwd(), "experiments", "fiction_slate", "joint_domain_task_adapter", "10bl8z6y", "checkpoints")
# load adapter checkpoints and activate it
adapter_name = model.load_adapter(full_path)
model.train_adapter([adapter_name])

repo_name = "joint_dt_fiction_slate"

model.push_adapter_to_hub(
    repo_name=repo_name,
    adapter_name=adapter_name,
    organization="UDApter",
    adapterhub_tag="nli/multinli",
    datasets_tag="multi_nli"
)

Expected behavior

The adapter should be pushed to the UDApter organization on the Hub without errors.

@bhavitvyamalik bhavitvyamalik added the bug Something isn't working label Feb 20, 2023
bhavitvyamalik (Author) commented Feb 20, 2023

Follow-up to #1343 (suggested by @Wauplin)

Wauplin commented Feb 21, 2023

Thanks for opening the issue @bhavitvyamalik. Looking into it more, I think this is solved by PR #473 from @Helw150 (:pray:), which was merged just yesterday :tada:

So if you install adapter-transformers from the main branch or wait for the next release, it should be fixed!

calpt (Member) commented Mar 6, 2023

Hey all, the fix provided by @Helw150 has been released in version 3.2.0, so I'm closing this issue.
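As a quick sanity check before pushing, you can verify that the installed adapter-transformers is at or above 3.2.0. This is just a sketch: the has_fix helper and its version parsing are illustrative, not part of the library.

```python
import re
from importlib.metadata import PackageNotFoundError, version


def has_fix(package="adapter-transformers", min_version=(3, 2, 0)):
    """Return True if `package` is installed at or above `min_version`."""
    try:
        installed = version(package)
    except PackageNotFoundError:
        return False
    # Keep only the leading numeric components ("3.2.0" -> (3, 2, 0)).
    parts = tuple(int(n) for n in re.findall(r"\d+", installed)[:3])
    return parts >= min_version
```

If has_fix() returns False, upgrade the package or install from the main branch as suggested above.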

calpt closed this as completed Mar 6, 2023