
Question about QFL in FCOS #4

Closed

fangchengji opened this issue Jun 17, 2020 · 14 comments

Comments

@fangchengji

Hi, I am really interested in the GFL work, and I'd like to know the implementation details of GFL in FCOS. I have tested QFL in FCOS, but I couldn't reproduce the results in the paper. FCOS_R_50_1x with its center sampling, GIoU loss, and normalization improvements gets 39.2 mAP in Detectron2, the same as ATSS with centerness. But when I remove the centerness branch and multiply the centerness ground truth with the class score as a soft label, using QFL, the result is 38.59 mAP. Could you help me find the issue?


Thanks!

@implus (Owner) commented Jun 17, 2020

You mean the pure FCOS can already achieve 39.2 in Detectron2? I think the result of FCOS in the original paper and MMDetection is around 38.5-38.6; there may be some additional tricks in the Detectron2 implementation. By the way, QFL should use IoU labels (not centerness labels), and you need to adjust the related code accordingly.

We will update the code with FCOS later. In the meantime, feel free to check your code against this repo. Thanks a lot for your interest~!
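As a rough sketch of the QFL idea discussed above (an illustrative, from-scratch version, not the repo's actual code): QFL is a cross-entropy against a soft quality target, modulated by the gap between the predicted score and that target.

```python
import math

def quality_focal_loss(pred_score, soft_target, beta=2.0):
    """Scalar sketch of Quality Focal Loss for one prediction.

    pred_score:  sigmoid output of the classification branch, in (0, 1)
    soft_target: quality label in [0, 1], e.g. IoU(pred box, gt box)
    beta:        modulating exponent (the paper uses beta = 2)
    """
    # binary cross-entropy against the soft target
    bce = -(soft_target * math.log(pred_score)
            + (1.0 - soft_target) * math.log(1.0 - pred_score))
    # modulate by how far the prediction is from the quality target
    return abs(soft_target - pred_score) ** beta * bce

# When the prediction matches the target exactly, the loss vanishes:
print(quality_focal_loss(0.5, 0.5))  # 0.0
```

The modulating factor is why predictions far from their IoU target dominate the gradient, analogous to focal loss for hard examples.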

@fangchengji (Author)

Thanks for your kind reply. I think it's just Detectron2's standard horizontal flip and 640-800 min-edge resize. So you recommend using IoU labels to generate the soft class label. But FCOS is anchor-free, so how can I calculate the IoU? Just move the center point with the ground-truth box?
Thanks!

@implus (Owner) commented Jun 17, 2020

IoU label has nothing to do with "anchor-free" or "anchor-based". It is calculated by the predicted box and its corresponding gt box. See line 176 in https://github.com/implus/GFocal/blob/master/mmdet/models/anchor_heads/gfl_head.py for reference.
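For reference, the IoU label in question is computed between the decoded predicted box and its ground-truth box; here is a minimal sketch with `(x1, y1, x2, y2)` boxes (illustrative only, not the repo's actual helper):

```python
def box_iou(pred, gt):
    """IoU of two axis-aligned boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(pred[0], gt[0]), max(pred[1], gt[1])
    ix2, iy2 = min(pred[2], gt[2]), min(pred[3], gt[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_p = (pred[2] - pred[0]) * (pred[3] - pred[1])
    area_g = (gt[2] - gt[0]) * (gt[3] - gt[1])
    union = area_p + area_g - inter
    return inter / union if union > 0 else 0.0

# The soft classification target is then this IoU instead of centerness:
print(box_iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1/7 ≈ 0.142857...
```

Since the label depends only on boxes, it applies equally to anchor-free heads: FCOS decodes its per-location distance predictions into a box first, then compares against the gt box.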

@fangchengji (Author)

> IoU label has nothing to do with "anchor-free" or "anchor-based". It is calculated by the predicted box and its corresponding gt box. See line 176 in https://github.com/implus/GFocal/blob/master/mmdet/models/anchor_heads/gfl_head.py for reference.

Oh, I see. So you use the IoU between the predicted box and the gt box as the score to generate the soft label. I will try it.
Thanks!

@fangchengji (Author)

> IoU label has nothing to do with "anchor-free" or "anchor-based". It is calculated by the predicted box and its corresponding gt box. See line 176 in https://github.com/implus/GFocal/blob/master/mmdet/models/anchor_heads/gfl_head.py for reference.

If I use the IoU between the predicted box and the gt box as the score to generate the soft label, the score is very low at the beginning of training. Does this matter?

@implus (Owner) commented Jun 17, 2020

It doesn't matter as the provided models are all trained under this scheme.

@implus implus closed this as completed Jun 17, 2020
@fangchengji (Author) commented Jun 17, 2020

I only got a 0.022 mAP boost using GFL with FCOS, using this FCOS repo (https://github.com/aim-uofa/AdelaiDet) with the default config and DIoU loss. Maybe I will try to modify the weights of the class loss and regression loss.

without QFL:

| AP     | AP50   | AP75   | APs    | APm    | APl    |
|--------|--------|--------|--------|--------|--------|
| 39.280 | 58.088 | 42.697 | 23.840 | 43.023 | 49.951 |

with QFL:

| AP     | AP50   | AP75   | APs    | APm    | APl    |
|--------|--------|--------|--------|--------|--------|
| 39.302 | 58.302 | 42.604 | 23.317 | 43.682 | 50.606 |

@implus (Owner) commented Jun 17, 2020

> Only got 0.022 mAP boost using GFL with FCOS. Using this FCOS repo (https://github.com/aim-uofa/AdelaiDet) with default config and DIoU loss. Maybe I will try to modify the weight of the class loss and regression loss.
>
> without GFL:
> AP 39.280, AP50 58.088, AP75 42.697, APs 23.840, APm 43.023, APl 49.951
>
> with GFL:
> AP 39.302, AP50 58.302, AP75 42.604, APs 23.317, APm 43.682, APl 50.606

You mean using both QFL and DFL in this version of FCOS? The results are quite different from our experiments... I suggest you go through the code in this repo carefully, as it may not be as simple as you think; e.g., the weight needs to be normalized or detached for the IoU loss and DFL loss. I also suggest you try this repo to reproduce the 40.2 AP of GFL_R50_1x.
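The "normalized or detached" point can be sketched as follows (a simplified, hypothetical illustration; the actual handling is in `gfl_head.py`): the per-sample weights come from predicted quality scores, so in an autograd framework they must be detached first, and the weighted loss is divided by the sum of weights rather than by the sample count.

```python
def weighted_loss(losses, score_weights):
    """Combine per-sample losses with score-derived weights.

    losses:        per-positive loss values (e.g. GIoU or DFL terms)
    score_weights: quality scores used as weights; in an autograd framework
                   these should be detached so the weights carry no gradient
    """
    norm = sum(score_weights)          # normalizer = sum of weights,
    if norm <= 0:                      # not the number of samples
        return 0.0
    return sum(w * l for w, l in zip(score_weights, losses)) / norm

# Equal weights reduce to a plain mean:
print(weighted_loss([2.0, 4.0], [1.0, 1.0]))  # 3.0
```

Skipping either step (detaching or normalizing) changes the effective loss scale over training, which is one plausible reason a re-implementation falls short of the reported numbers.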

@fangchengji (Author)

> You mean using both QFL and DFL in this version of FCOS? The results are quite different from our experiments... I suggest you go through the code in this repo carefully, as it may not be as simple as you think; e.g., the weight needs to be normalized or detached for the IoU loss and DFL loss. I also suggest you try this repo to reproduce the 40.2 AP of GFL_R50_1x.

Sorry, I only used QFL; "GFL" was a typo. I know the devil is in the details, and I will debug the code. Really, thanks for your help. Looking forward to your FCOS GFL implementation.

@Xudangliatiger

I used the original ATSS implementation with QFL, where the IoU is used as the label and the localization weight for the GIoU loss is set to the cls score, and I only got 39.25 AP……

@implus (Owner) commented Jul 19, 2020 via email


In both this repo and the official mmdetection, we have pretrained models, logs, and code. You are welcome to check whether you have missed some important details. For ATSS with QFL, I suggest you try gfl_R50_1x with the DFL weight set to 0. Thank you!

@Xudangliatiger

Thanks for your reply. Don't take it the wrong way; it is an elegant work and I like it.

I also wrote a paper last year that just used an MSE loss to shorten the distance between the cls score and the loc score (IoU). After that, I proposed a warpage loss which is very similar to your QFL, but it seems that you finished QFL earlier than the warpage loss~ So I guess it would be hard to publish my recent work now...

Recently I was reading your code. I have not reimplemented QFL in my own repo yet, but I already tried the weight = 0 setting and it gave the same performance (39.15).

However, I used the IoUs from the IoU loss and called .detach_() on them, whereas I noticed you recalculate the IoU for QFL. Do you think that matters? Or should I use detach() instead of detach_()?

@implus (Owner) commented Jul 20, 2020

Thanks for your attention~ It is common to come across similar results and methods during research; it has happened to me as well. It doesn't matter as long as you believe in yourself: you always have the right to keep iterating, improving, and innovating until you discover more new techniques.

For the detach() question, it is suggested to use the form `B = A.detach()` to get a detached copy of A (i.e., B) alongside the undetached A, because you might want the gradient of A in one loss but not in another loss (or loss weight).

Good Luck~!
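The distinction can be illustrated with a small PyTorch snippet (a generic illustration, not code from this repo): `detach()` returns a new tensor cut from the autograd graph while the original keeps its gradient path, whereas `detach_()` cuts the tensor itself in place.

```python
import torch

# A non-leaf tensor that participates in the autograd graph.
a = torch.tensor([1.0, 2.0], requires_grad=True) * 3.0
b = a.detach()            # b is detached; a still requires grad
print(a.requires_grad, b.requires_grad)  # True False

c = torch.tensor([1.0, 2.0], requires_grad=True) * 3.0
c.detach_()               # c itself is now detached in place
print(c.requires_grad)    # False
```

So with `B = A.detach()`, `A` remains usable where its gradient is needed (e.g. the regression loss) while `B` serves as a gradient-free weight; `A.detach_()` removes that option.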

@wantsjean

@fangchengji Hello, I noticed that you have reimplemented ATSS and GFocal in your forked Detectron2. Did they get the expected boost?
