@Zzh-tju Hi,
Thank you for your work, my loss converges a lot faster! But after a few iterations, I get the same error as #356 when training on a custom dataset.
##################
The same data list works with the official YOLOv3 (pjreddie/darknet).
The error happens randomly, and there are no 0.0 values in the annotation .txt files.
##################
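A minimal sketch of the kind of annotation check I mean, assuming darknet-format labels (one "class x_center y_center width height" line per object, all four coordinates normalized to (0, 1]); the helper name check_label_file is only illustrative, not part of darknet:

```c
#include <stdio.h>

/* Sanity-check one darknet-format label file.
 * Assumes each line is: class x_center y_center width height,
 * with the four coordinates normalized to (0, 1].
 * Returns the number of suspicious lines found, or -1 on I/O error. */
int check_label_file(const char *path)
{
    FILE *f = fopen(path, "r");
    if (!f) { fprintf(stderr, "cannot open %s\n", path); return -1; }

    int bad = 0, cls, line = 0;
    float x, y, w, h;
    while (fscanf(f, "%d %f %f %f %f", &cls, &x, &y, &w, &h) == 5) {
        ++line;
        if (w <= 0.f || h <= 0.f ||
            x <= 0.f || y <= 0.f || x > 1.f || y > 1.f || w > 1.f || h > 1.f) {
            printf("%s line %d: class=%d x=%f y=%f w=%f h=%f\n",
                   path, line, cls, x, y, w, h);
            ++bad;
        }
    }
    fclose(f);
    return bad;
}

int main(int argc, char **argv)
{
    /* Usage: ./check_labels label1.txt label2.txt ... */
    int total = 0;
    for (int i = 1; i < argc; ++i) {
        int b = check_label_file(argv[i]);
        if (b > 0) total += b;
    }
    printf("suspicious boxes: %d\n", total);
    return 0;
}
```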
What I have tried (the corresponding cfg lines are sketched after this list):
tried random with 0 and 1
tried batch/subdivisions with 64/32, 64/64, 32/16, and 4/2
tried width/height with 608, 512, and 384
tried nms_kind with diounms and greedynms
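The settings above correspond to .cfg entries roughly like these (just one of the combinations listed, shown only as an illustration):

```
[net]
batch=64
subdivisions=32
width=512
height=512

[yolo]
random=1
nms_kind=diounms
```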
I didn't modify the bbox calculation in YOLOv3; it is still as AlexeyAB described:
b.x = (i + logistic_activate(x[index + 0])) / w
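For context, that line is part of the standard YOLO box decoding; a self-contained sketch is below (names such as decode_box, grid_w, and anchor_w are illustrative, not darknet's exact identifiers):

```c
#include <math.h>

typedef struct { float x, y, w, h; } box;

static float sigmoid(float v) { return 1.f / (1.f + expf(-v)); }

/* Sketch of the standard YOLO box decoding for one prediction.
 * tx, ty, tw, th are the raw network outputs for this cell/anchor,
 * (i, j) is the grid cell, grid_w/grid_h the grid size,
 * anchor_w/anchor_h the anchor in pixels, net_w/net_h the network input size. */
box decode_box(float tx, float ty, float tw, float th,
               int i, int j, int grid_w, int grid_h,
               float anchor_w, float anchor_h, int net_w, int net_h)
{
    box b;
    b.x = (i + sigmoid(tx)) / grid_w;   /* center x, normalized to [0,1] */
    b.y = (j + sigmoid(ty)) / grid_h;   /* center y, normalized to [0,1] */
    b.w = expf(tw) * anchor_w / net_w;  /* width, normalized to input width */
    b.h = expf(th) * anchor_h / net_h;  /* height, normalized to input height */
    return b;
}
```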
I am using a Tesla P40 GPU with 20GB of memory, and my anchors are 38,41, 86,52, 65,103, 146,79, 103,164, 180,164, 245,107, 152,256, 254,247 for width=512.
Is there anything else I have missed, and what else could cause this error? Thanks!
OK, I think this problem may be caused by the dataset. When I trained on coco2017 before, it would always fail within a few hundred iterations. Setting batch/subdivisions to 64/32 only delays the error, to maybe a few thousand iterations. When I trained on coco2014, it did not happen again.