
Query about Tent on semantic segmentation #21

Closed

2018302345 opened this issue May 17, 2023 · 4 comments

Comments

@2018302345

For the classification task, Tent uses the entropy of the predicted class probabilities as its loss. How is this loss calculated in the segmentation task?

@qinenergy
Owner

qinenergy commented May 17, 2023

Hi, it is similar to the classification case, except that the entropy is computed for every pixel. You can find an example here:
Use: https://github.com/qinenergy/corda/blob/main/trainUDA_gta.py#L529
Loss: https://github.com/qinenergy/corda/blob/main/utils/loss.py#L35
You can also find our code at #6
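
For reference, here is a minimal sketch of what a per-pixel entropy loss can look like, assuming PyTorch and segmentation logits of shape (N, C, H, W). This is an illustrative example, not the exact code from the links above:

```python
import torch
import torch.nn.functional as F

def per_pixel_entropy_loss(logits: torch.Tensor) -> torch.Tensor:
    """Tent-style entropy of the softmax prediction, computed at every
    pixel of a segmentation output and averaged over pixels and batch."""
    probs = F.softmax(logits, dim=1)           # (N, C, H, W) class probabilities
    log_probs = F.log_softmax(logits, dim=1)   # numerically stable log-probabilities
    entropy = -(probs * log_probs).sum(dim=1)  # (N, H, W) per-pixel entropy
    return entropy.mean()
```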

@2018302345
Author

Thank you very much for the thorough answer. In the code from #6, in acdc-submission\mmseg\apis\test.py, single_gpu_tent (line 227) uses the predicted result directly as gt_semantic_seg, and I could not find where the loss function is replaced. In acdc-submission\local_configs\segformer\B5\segformer.b5.1024x1024.acdc.160k.py the loss_decode is 'CrossEntropyLoss', in which case it would just compute the CrossEntropyLoss of two identical tensors, which I don't quite understand.

@qinenergy
Owner

  1. In our case, they are not two identical tensors, because of the teacher-student model (the pseudo-label comes from the teacher) and masking.
  2. Even in the simplified identical case, the cross-entropy of a probability distribution with itself is, by definition, its entropy, so I see no problem nor any difference from the classification use case.

If you are not sure, please refer to https://en.wikipedia.org/wiki/Cross_entropy
A similar question was asked in #18.
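
A quick numerical check of that definition (an illustrative example, not code from the repository): the cross-entropy H(p, q) = -Σ_i p_i log q_i reduces to the entropy H(p) when q = p.

```python
import torch

def cross_entropy(p: torch.Tensor, q: torch.Tensor) -> torch.Tensor:
    # H(p, q) = -sum_i p_i * log(q_i)
    return -(p * q.log()).sum()

p = torch.softmax(torch.randn(5), dim=0)   # an arbitrary probability vector
entropy_p = -(p * p.log()).sum()           # H(p)

assert torch.allclose(cross_entropy(p, p), entropy_p)  # H(p, p) == H(p)
```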

@2018302345
Author

Thank you for your answer and good luck with your research!
