There is a warning. #9

Closed
JiuqingDong opened this issue Mar 7, 2023 · 0 comments
Comments

@JiuqingDong

[03/07 20:19:00 d2.data.common]: Serialized dataset takes 424.32 MiB
[03/07 20:19:02 detectron2]: Starting training from iteration 0
[W reducer.cpp:1303] Warning: find_unused_parameters=True was specified in DDP constructor, but did not find any unused parameters in the forward pass. This flag results in an extra traversal of the autograd graph every iteration, which can adversely affect performance. If your model indeed never has any unused parameters in the forward pass, consider turning this flag off. Note that this warning may be a false positive if your model has flow control causing later iterations to have unused parameters. (function operator())
(the same warning is printed three times)
[03/07 20:19:45 d2.utils.events]: eta: 14:33:54 iter: 20 total_loss: 0.9773 caption_loss: 0.1988 loss_box_reg: 0.1348 loss_cls: 0.109 loss_rpn_cls: 0.08208 loss_rpn_loc: 0.06292 ot_loss: 0.3213 time: 0.5014 data_time: 1.6379 lr: 0.00039962 max_mem: 5244M.
...Processing...

I am not sure whether this warning is normal or whether it indicates a problem with training.
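
For reference, the warning can usually be silenced by disabling the flag where the model is wrapped for distributed training. Below is a minimal sketch assuming the model is wrapped directly with PyTorch's DistributedDataParallel; the actual wrapping code in this repository may differ, and `wrap_model_for_ddp` / `local_rank` are illustrative names, not part of the project.

```python
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP

def wrap_model_for_ddp(model: nn.Module, local_rank: int) -> nn.Module:
    # Hypothetical helper (not from this repo): wrap the model for DDP training.
    # Setting find_unused_parameters=False skips the extra autograd-graph
    # traversal the warning mentions; only do this if every parameter really
    # receives a gradient in each forward/backward pass.
    model = model.to(f"cuda:{local_rank}")
    return DDP(
        model,
        device_ids=[local_rank],
        broadcast_buffers=False,
        find_unused_parameters=False,  # was True, which triggers the warning
    )
```

As the warning itself notes, leaving the flag on is harmless for correctness; it only adds per-iteration overhead, and it is required if some parameters are genuinely unused on certain iterations.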
