
Cannot load XLM from torch hub - cannot unpack non-iterable NoneType object #4214

arpcode opened this issue Feb 14, 2022 · 4 comments

arpcode commented Feb 14, 2022

xlmr = torch.hub.load('pytorch/fairseq', 'xlmr.large')

I get the following error message -

TypeError                                 Traceback (most recent call last)
/tmp/ipykernel_11808/3675591151.py in <module>
----> 1 xlmr = torch.hub.load('pytorch/fairseq', 'xlmr.large', force_reload = True)

~/anaconda3/envs/btp/lib/python3.9/site-packages/torch/hub.py in load(repo_or_dir, model, source, force_reload, verbose, skip_validation, *args, **kwargs)
    397         repo_or_dir = _get_cache_or_reload(repo_or_dir, force_reload, verbose, skip_validation)
    398 
--> 399     model = _load_local(repo_or_dir, model, *args, **kwargs)
    400     return model
    401 

~/anaconda3/envs/btp/lib/python3.9/site-packages/torch/hub.py in _load_local(hubconf_dir, model, *args, **kwargs)
    423 
    424     hubconf_path = os.path.join(hubconf_dir, MODULE_HUBCONF)
--> 425     hub_module = import_module(MODULE_HUBCONF, hubconf_path)
    426 
    427     entry = _load_entry_from_hubconf(hub_module, model)

~/anaconda3/envs/btp/lib/python3.9/site-packages/torch/hub.py in import_module(name, path)
     74     module = importlib.util.module_from_spec(spec)
     75     assert isinstance(spec.loader, Loader)
---> 76     spec.loader.exec_module(module)
     77     return module
     78 

~/anaconda3/envs/btp/lib/python3.9/importlib/_bootstrap_external.py in exec_module(self, module)

~/anaconda3/envs/btp/lib/python3.9/importlib/_bootstrap.py in _call_with_frames_removed(f, *args, **kwds)

~/.cache/torch/hub/pytorch_fairseq_main/hubconf.py in <module>
     37 
     38 # only do fairseq imports after checking for dependencies
---> 39 from fairseq.hub_utils import (  # noqa; noqa
     40     BPEHubInterface as bpe,
     41     TokenizerHubInterface as tokenizer,

~/.cache/torch/hub/pytorch_fairseq_main/fairseq/__init__.py in <module>
     31 hydra_init()
     32 
---> 33 import fairseq.criterions  # noqa
     34 import fairseq.distributed  # noqa
     35 import fairseq.models  # noqa

~/.cache/torch/hub/pytorch_fairseq_main/fairseq/criterions/__init__.py in <module>
     16 
     17 
---> 18 (
     19     build_criterion_,
     20     register_criterion,

TypeError: cannot unpack non-iterable NoneType object

I've seen this posted time and again, but none of the threads provide a solution.

@YanineeJam

You should install a torch version that works with this command. I recommend installing torch==1.6.0 and torchvision==0.7.0. It works for me. :)
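
A minimal sketch of that, assuming a plain pip environment (the exact versions are only the ones suggested above, not something verified against current fairseq):

pip uninstall -y torch torchvision
pip install torch==1.6.0 torchvision==0.7.0

Then retry xlmr = torch.hub.load('pytorch/fairseq', 'xlmr.large', force_reload=True) in a fresh interpreter so the cached hub checkout is re-fetched against the reinstalled torch.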

@zhouliang-yu

I have the same problem. I tried switching torch and torchvision to 1.6.0 and 0.7.0 respectively, but it doesn't help.


shoang22 commented Oct 1, 2022

Did you try installing via pip install fairseq? I ran into the same issue when I tried to pip install the editable version. I had to uninstall it and reinstall with the aforementioned command.
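
Roughly, assuming the editable install was done with pip install -e . from a local fairseq checkout (a sketch of the steps, not a verified fix):

pip uninstall -y fairseq
pip install fairseq

After that, re-run the torch.hub.load call in a new session; a leftover editable install can otherwise still shadow the PyPI package on sys.path.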

@VidushiVashishth

pip installing fairseq changes the error to the following for me:

/usr/lib/python3.8/distutils/dist.py in _parse_command_opts(self, parser, args)
    544         # to be sure that the basic "command" interface is implemented.
    545         if not issubclass(cmd_class, Command):
--> 546             raise DistutilsClassError(
    547                 "command class %s must subclass Command" % cmd_class)
    548 

DistutilsClassError: command class <class 'torch.utils.cpp_extension.BuildExtension'> must subclass Command
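
One speculative workaround sketch (my assumption, not something confirmed in this thread): this kind of DistutilsClassError often appears when setuptools >= 60 swaps in its vendored copy of distutils while torch's BuildExtension was resolved against the stdlib one. Forcing the stdlib distutils during the install may avoid the mismatch:

SETUPTOOLS_USE_DISTUTILS=stdlib pip install fairseq

If that doesn't help, pinning setuptools below 60 (pip install 'setuptools<60') before installing fairseq is another commonly tried variant.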
