
Runtime error for mmseg #10

Closed
lxtGH opened this issue May 24, 2021 · 2 comments
lxtGH commented May 24, 2021

RuntimeError: Expected to have finished reduction in the prior iteration before starting a new one. This error indicates that your module has parameters that were not used in producing loss. You can enable unused parameter detection by (1) passing the keyword argument find_unused_parameters=True to torch.nn.parallel.DistributedDataParallel; (2) making sure all forward function outputs participate in calculating loss. If you already have done the above two steps, then the distributed data parallel module wasn't able to locate the output tensors in the return value of your module's forward function. Please include the loss function and the structure of the return value of forward of your module when reporting this issue (e.g. list, dict, iterable).

Hi! Thanks for open-sourcing the code.
When I use the Twins backbone in mmseg, I hit this error. It seems that several parameters do not contribute to the loss.
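The situation the error message describes can be reproduced in a minimal, single-process sketch (this is an illustration with a toy module, not the Twins model itself; it assumes PyTorch with the gloo backend is available). A registered submodule that is never called in `forward` leaves its parameters without gradients, which trips DDP's reducer unless `find_unused_parameters=True` is passed:

```python
# Minimal single-process sketch of the unused-parameter situation in DDP.
# TwinsLike is a hypothetical toy module for illustration only.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29501")
dist.init_process_group("gloo", rank=0, world_size=1)

class TwinsLike(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.used = torch.nn.Linear(4, 4)
        self.unused = torch.nn.Linear(4, 4)  # registered but never called

    def forward(self, x):
        return self.used(x)                  # self.unused gets no gradient

# find_unused_parameters=True tells the reducer to mark the parameters of
# self.unused as ready each iteration instead of waiting for their grads;
# without it, the quoted RuntimeError would surface on the next iteration.
model = DDP(TwinsLike(), find_unused_parameters=True)

completed = 0
for _ in range(2):
    model.zero_grad()
    loss = model(torch.randn(2, 4)).sum()
    loss.backward()
    completed += 1

dist.destroy_process_group()
```

After both iterations, `model.module.unused.weight.grad` is still `None`, confirming those parameters never participated in the loss.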

cxxgtxy (Collaborator) commented May 24, 2021

The config file
https://github.com/Meituan-AutoML/Twins/blob/main/segmentation/configs/_base_/default_runtime.py
already sets
find_unused_parameters = True.
If you include this file, it has the same effect as passing the keyword argument find_unused_parameters=True to torch.nn.parallel.DistributedDataParallel.
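In mmcv/mmseg-style configs, that flag is a plain top-level variable: the training launcher reads it from the merged config and forwards it to DistributedDataParallel when wrapping the model. A minimal sketch of the relevant fragment (a config fragment in the spirit of the linked default_runtime.py, not its full contents):

```python
# Fragment in the style of segmentation/configs/_base_/default_runtime.py.
# mmseg's train script picks this flag up from the loaded config and passes
# it to torch.nn.parallel.DistributedDataParallel when building the model.
find_unused_parameters = True
```

Any config that lists this file in its `_base_` chain inherits the flag automatically.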

lxtGH (Author) commented May 26, 2021

Thanks for your reply!

@lxtGH lxtGH closed this as completed May 26, 2021