[03/07 20:19:00 d2.data.common]: Serialized dataset takes 424.32 MiB
[03/07 20:19:02 detectron2]: Starting training from iteration 0
[W reducer.cpp:1303] Warning: find_unused_parameters=True was specified in DDP constructor, but did not find any unused parameters in the forward pass. This flag results in an extra traversal of the autograd graph every iteration, which can adversely affect performance. If your model indeed never has any unused parameters in the forward pass, consider turning this flag off. Note that this warning may be a false positive if your model has flow control causing later iterations to have unused parameters. (function operator())
[03/07 20:19:45 d2.utils.events]: eta: 14:33:54 iter: 20 total_loss: 0.9773 caption_loss: 0.1988 loss_box_reg: 0.1348 loss_cls: 0.109 loss_rpn_cls: 0.08208 loss_rpn_loc: 0.06292 ot_loss: 0.3213 time: 0.5014 data_time: 1.6379 lr: 0.00039962 max_mem: 5244M.
...Processing...
I am not sure whether this warning is normal.
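For reference, the warning comes from PyTorch's DDP reducer, not from detectron2 itself: with `find_unused_parameters=True`, DDP traverses the autograd graph every iteration to find parameters that received no gradient, which costs time when none exist. If every parameter is always used in the forward pass, the flag can be turned off where the model is wrapped. Below is a minimal single-process sketch of that (the `Linear` model and `gloo` setup are illustrative, not the training code from this log):

```python
# Minimal sketch: wrapping a model with find_unused_parameters=False,
# which is only safe if every parameter receives a gradient each step.
# The model and single-process gloo setup here are hypothetical.
import os
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)  # single-process demo

model = torch.nn.Linear(4, 2)
# False skips the extra autograd-graph traversal the warning describes.
ddp_model = DDP(model, find_unused_parameters=False)

out = ddp_model(torch.randn(8, 4))
out.sum().backward()
assert model.weight.grad is not None  # every parameter got a gradient
dist.destroy_process_group()
```

Note the warning text itself hedges: if the model has control flow that only sometimes skips parameters, `find_unused_parameters=True` is still required, and the message is a false positive.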