cannot import name 'TransformerEncoderLayer' from partially initialized module 'fairseq.modules' (most likely due to a circular import) #321

Open
peminguyen opened this issue Dec 16, 2022 · 1 comment

@peminguyen

/home/pemi/miniconda3/envs/env/lib/python3.9/site-packages/torch/distributed/launch.py:180: FutureWarning: The module torch.distributed.launch is deprecated
and will be removed in future. Use torchrun.
Note that --use_env is set by default in torchrun.
If your script expects `--local_rank` argument to be set, please
change it to read from `os.environ['LOCAL_RANK']` instead. See 
https://pytorch.org/docs/stable/distributed.html#launch-utility for 
further instructions

  warnings.warn(
WARNING:torch.distributed.run:
*****************************************
Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed. 
*****************************************
Traceback (most recent call last):
  File "/home/pemi/OFA/run_scripts/glue/../../train.py", line 29, in <module>
    from fairseq import (
  File "/home/pemi/OFA/fairseq/fairseq/quantization_utils.py", line 8, in <module>
    from fairseq.modules.quantization import pq, quantization_options, scalar
  File "/home/pemi/OFA/fairseq/fairseq/modules/__init__.py", line 39, in <module>
    from .transformer_layer import TransformerDecoderLayer, TransformerEncoderLayer
  File "/home/pemi/OFA/fairseq/fairseq/modules/transformer_layer.py", line 15, in <module>
    from fairseq.models.transformer import (
  File "/home/pemi/OFA/fairseq/fairseq/models/__init__.py", line 236, in <module>
    import_models(models_dir, "fairseq.models")
  File "/home/pemi/OFA/fairseq/fairseq/models/__init__.py", line 218, in import_models
    importlib.import_module(namespace + "." + model_name)
  File "/home/pemi/miniconda3/envs/env/lib/python3.9/importlib/__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "/home/pemi/OFA/fairseq/fairseq/models/speech_to_text/__init__.py", line 7, in <module>
    from .convtransformer import *  # noqa
  File "/home/pemi/OFA/fairseq/fairseq/models/speech_to_text/convtransformer.py", line 19, in <module>
    from fairseq.modules import LayerNorm, PositionalEmbedding, TransformerEncoderLayer
ImportError: cannot import name 'TransformerEncoderLayer' from partially initialized module 'fairseq.modules' (most likely due to a circular import) (/home/pemi/OFA/fairseq/fairseq/modules/__init__.py)
ERROR:torch.distributed.elastic.multiprocessing.api:failed (exitcode: 1) local_rank: 0 (pid: 6079) of binary: /home/pemi/miniconda3/envs/env/bin/python3
Traceback (most recent call last):
  File "/home/pemi/miniconda3/envs/env/lib/python3.9/runpy.py", line 197, in _run_module_as_main
    return _run_code(code, main_globals, None,
  File "/home/pemi/miniconda3/envs/env/lib/python3.9/runpy.py", line 87, in _run_code
    exec(code, run_globals)
  File "/home/pemi/miniconda3/envs/env/lib/python3.9/site-packages/torch/distributed/launch.py", line 195, in <module>
    main()
  File "/home/pemi/miniconda3/envs/env/lib/python3.9/site-packages/torch/distributed/launch.py", line 191, in main
    launch(args)
  File "/home/pemi/miniconda3/envs/env/lib/python3.9/site-packages/torch/distributed/launch.py", line 176, in launch
    run(args)
  File "/home/pemi/miniconda3/envs/env/lib/python3.9/site-packages/torch/distributed/run.py", line 753, in run
    elastic_launch(
  File "/home/pemi/miniconda3/envs/env/lib/python3.9/site-packages/torch/distributed/launcher/api.py", line 132, in __call__
    return launch_agent(self._config, self._entrypoint, list(args))
  File "/home/pemi/miniconda3/envs/env/lib/python3.9/site-packages/torch/distributed/launcher/api.py", line 246, in launch_agent
    raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError: 
============================================================
../../train.py FAILED
------------------------------------------------------------
Failures:
[1]:
  time      : 2022-12-16_09:50:47
  host      : baker
  rank      : 1 (local_rank: 1)
  exitcode  : 1 (pid: 6080)
  error_file: <N/A>
  traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
  time      : 2022-12-16_09:50:47
  host      : baker
  rank      : 0 (local_rank: 0)
  exitcode  : 1 (pid: 6079)
  error_file: <N/A>
  traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
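For context, the cycle the traceback shows is: `fairseq/modules/__init__.py` starts importing `transformer_layer`, which imports `fairseq.models`, which imports every model file including `speech_to_text/convtransformer.py`, which in turn does `from fairseq.modules import ...` while `fairseq.modules` is still only partially initialized. Below is a minimal, self-contained sketch of the same pattern; all package and module names in it (`mypkg`, `layers`, `models`, `EncoderLayer`) are made up for illustration and are not part of fairseq or OFA.

```python
# Hypothetical three-file package that mirrors the shape of the cycle
# fairseq.modules -> transformer_layer -> fairseq.models -> fairseq.modules.
import os
import sys
import tempfile
import textwrap

pkg_root = tempfile.mkdtemp()
pkg_dir = os.path.join(pkg_root, "mypkg")
os.makedirs(pkg_dir)

files = {
    # __init__.py starts importing EncoderLayer from layers.py ...
    "__init__.py": "from .layers import EncoderLayer\n",
    # ... layers.py pulls in models.py before EncoderLayer is defined ...
    "layers.py": textwrap.dedent(
        """
        from .models import build_model

        class EncoderLayer:
            pass
        """
    ),
    # ... and models.py imports back from the package, whose __init__.py is
    # still executing, so the name does not exist there yet.
    "models.py": textwrap.dedent(
        """
        from mypkg import EncoderLayer

        def build_model():
            return EncoderLayer()
        """
    ),
}
for name, source in files.items():
    with open(os.path.join(pkg_dir, name), "w") as handle:
        handle.write(source)

sys.path.insert(0, pkg_root)
try:
    import mypkg  # noqa: F401
except ImportError as exc:
    # Prints e.g.: cannot import name 'EncoderLayer' from partially
    # initialized module 'mypkg' (most likely due to a circular import)
    print(exc)
```

Running the sketch prints the same kind of message as the traceback above ("cannot import name ... from partially initialized module ... most likely due to a circular import").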
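Separately, the FutureWarning at the top of the log is unrelated to the import error: `torch.distributed.launch` is deprecated in favour of `torchrun`. A hedged sketch of the migration that warning asks for follows; the launch command and worker count below are placeholders, not the exact OFA run-script command line.

```python
# Illustrative only -- launch with torchrun instead of torch.distributed.launch:
#
#   torchrun --nproc_per_node=2 ../../train.py <training args>
#
# torchrun uses --use_env behaviour by default, so instead of a --local_rank
# argument the script should read the LOCAL_RANK environment variable that
# torchrun exports for each worker process.
import os

import torch

local_rank = int(os.environ.get("LOCAL_RANK", 0))
if torch.cuda.is_available():
    torch.cuda.set_device(local_rank)
```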

@yangapku
Member

yangapku commented Jan 5, 2023

Please refer to #217, #225, and #249 and see if they help.
