
DPSR error in training #3

Closed
pjh1023 opened this issue Jan 3, 2020 · 2 comments
@pjh1023

pjh1023 commented Jan 3, 2020

/home/pjh/anaconda3/envs/DPSR/lib/python3.7/site-packages/torch/optim/lr_scheduler.py:100: UserWarning: Detected call of lr_scheduler.step() before optimizer.step(). In PyTorch 1.1.0 and later, you should call them in the opposite order: optimizer.step() before lr_scheduler.step(). Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate
"https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate", UserWarning)

I keep getting this warning when I try to run the training code (main_train_dpsr.py).
I've seen issue #2, which deals with a similar error, but I'm wondering whether the problem is also fixed in the DPSR code. Do you have any solutions?

@pjh1023 pjh1023 changed the title DSPR error in training DPSR error in training Jan 3, 2020
@cszn
Owner

cszn commented Jan 3, 2020

It is a UserWarning, not an error. You can ignore it.
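For reference, the warning only concerns the order of the two `step()` calls inside the training loop. A minimal sketch of the ordering PyTorch 1.1.0+ expects (the model, optimizer, and scheduler here are illustrative placeholders, not the actual DPSR setup in main_train_dpsr.py):

```python
import torch

# Toy model/optimizer/scheduler purely to demonstrate call order.
model = torch.nn.Linear(1, 1)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
scheduler = torch.optim.lr_scheduler.StepLR(optimizer, step_size=2, gamma=0.5)

for epoch in range(4):
    optimizer.zero_grad()
    loss = model(torch.ones(1, 1)).sum()
    loss.backward()
    optimizer.step()   # update the weights first (PyTorch >= 1.1.0)
    scheduler.step()   # then advance the learning-rate schedule
```

Calling `scheduler.step()` before `optimizer.step()` (the pre-1.1.0 convention) triggers the warning and skips the first value of the schedule, but training otherwise proceeds normally, which is why it is safe to ignore.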

@pjh1023
Author

pjh1023 commented Jan 5, 2020

Thanks for your quick answer!
