
Any recent changes to torch? #126

Closed
anuragkumar95 opened this issue Apr 13, 2021 · 6 comments

Comments

@anuragkumar95

Running into this error.

Environment: docker container python3.7
Command: genienlp train args...

root@5da4bac83a79:/var/app# Traceback (most recent call last):
  File "/usr/local/bin/genienlp", line 5, in <module>
    from genienlp.__main__ import main
  File "/usr/local/lib/python3.7/site-packages/genienlp/__main__.py", line 33, in <module>
    from . import arguments, train, predict, server, cache_embeddings, export
  File "/usr/local/lib/python3.7/site-packages/genienlp/arguments.py", line 39, in <module>
    from .paraphrase.transformers_utils import BART_MODEL_LIST, MBART_MODEL_LIST, MT5_MODEL_LIST
  File "/usr/local/lib/python3.7/site-packages/genienlp/paraphrase/transformers_utils.py", line 7, in <module>
    from transformers import LogitsProcessorList
  File "/usr/local/lib/python3.7/site-packages/transformers/__init__.py", line 626, in <module>
    from .trainer import Trainer
  File "/usr/local/lib/python3.7/site-packages/transformers/trainer.py", line 69, in <module>
    from .trainer_pt_utils import (
  File "/usr/local/lib/python3.7/site-packages/transformers/trainer_pt_utils.py", line 40, in <module>
    from torch.optim.lr_scheduler import SAVE_STATE_WARNING
ImportError: cannot import name 'SAVE_STATE_WARNING' from 'torch.optim.lr_scheduler' (/usr/local/lib/python3.7/site-packages/torch/optim/lr_scheduler.py)
@valkjsaaa

I think you need to use PyTorch before 1.8.0: huggingface/transformers#8979
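The linked issue boils down to a simple version boundary: `SAVE_STATE_WARNING` exists in `torch.optim.lr_scheduler` only before PyTorch 1.8, so transformers releases that import it break on newer torch. A minimal sketch of that check (the function name is hypothetical, written just to illustrate the boundary):

```python
def torch_is_compatible(torch_version: str) -> bool:
    """Return True if this torch version still ships SAVE_STATE_WARNING,
    i.e. it is older than 1.8.0 (assumption based on this thread and
    huggingface/transformers#8979)."""
    major, minor = (int(part) for part in torch_version.split(".")[:2])
    return (major, minor) < (1, 8)

print(torch_is_compatible("1.7.1"))  # True: import works
print(torch_is_compatible("1.8.0"))  # False: ImportError as in the traceback above
```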

@s-jse
Member

s-jse commented Apr 13, 2021

Which version of pytorch and transformers do you have installed? We run genienlp tests with pytorch 1.7.1 and transformers 4.4.

@anuragkumar95
Author

When I build the container, I install genienlp with
pip install genienlp

Shouldn't the dependencies be handled in the setup?

@s-jse
Member

s-jse commented Apr 14, 2021

If you install genienlp from pip, you get genienlp v0.5, which had torch~=1.6 and transformers==4.0 in its requirements. Since torch~=1.6 permits any 1.x release, installing it now pulls in PyTorch 1.8 alongside the old transformers (4.0), and those two are not compatible.
You can either install the new genienlp from source by cloning the repository, or manually install the old PyTorch (1.6) in your Docker container after installing genienlp.
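As a sketch of the second workaround, a pin file (hypothetical name requirements-pin.txt; versions taken from this thread) could be applied in the Dockerfile after installing genienlp, e.g. with pip install -r requirements-pin.txt:

```
# requirements-pin.txt (hypothetical): force the torch/transformers pair
# this thread reports genienlp v0.5 was released against
torch==1.6.0
transformers==4.0.0
```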

@gcampax are we planning to release genienlp v0.6 on PyPI?

@gcampax
Contributor

gcampax commented Apr 14, 2021

We need to switch to bootleg 1.* before we can release on PyPI; otherwise PyPI will reject the git dependency.

@Mehrad0711
Member

Fixed in the latest release (0.6.0).
