
mutual loss #15

Closed
wentianli opened this issue Jan 24, 2018 · 3 comments

Comments

@wentianli

Have you tried pml, gdml, and ldml separately? It looks like you only used gdml for your current results on Market1501.
Recently I've been having trouble reproducing the results for the mutual classification loss with my own code :(
Thanks.

@huanghoujing
Owner

I have tried a simple combination of classification loss and mutual loss, though in another project. I used two ResNet-50s. The CMC Rank-1 accuracy of the baseline is ~84%; adding the mutual loss gains over 1 point of improvement.

In this project, GL + IDL + ML + TWGD has an additional probability mutual loss and global distance mutual loss compared to the baseline GL + IDL + TWGD. I did not try adding only the probability mutual loss here.

@Phoebe-star

Phoebe-star commented Mar 7, 2018

Hi, I'm having some trouble.

  1. In the paper, how is the classification loss computed?
    Inputting a 224x224 image to ResNet-50 gives a 7x7x2048 feature map, and then what? I have no idea.
    Is it using a fully connected layer and softmax for classification?

  2. What about the mutual loss? Is it just a combination of two different losses
    (for example, triplet loss + classification loss = mutual loss)?

    And what about the metric loss?

    Thank you.

@huanghoujing
Owner

  1. You can look into the structure of ResNet-50. For an input of size 224x224x3, the output of ResNet-50's conv5 is 7x7x2048. Then an average pooling layer reduces it to 1x1x2048, followed by the final FC layer. The classification loss is then calculated as in an ordinary object recognition task.
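The pipeline described above (global average pooling over conv5, an FC layer, then softmax cross-entropy) can be sketched in NumPy. This is a minimal illustration, not the repository's actual code; the function name, shapes, and parameters are assumptions for the example.

```python
import numpy as np

def classification_loss(conv5_out, W, b, label):
    """Sketch of the classification head described in the answer:
    global average pooling over the 7x7 grid, an FC layer, then
    softmax cross-entropy against the ground-truth identity.

    conv5_out: (7, 7, 2048) feature map from ResNet-50 conv5
    W: (2048, num_classes) FC weights; b: (num_classes,) bias
    label: ground-truth class index
    """
    feat = conv5_out.mean(axis=(0, 1))        # global average pooling -> (2048,)
    logits = feat @ W + b                     # final FC layer -> (num_classes,)
    logits = logits - logits.max()            # shift for numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[label])              # softmax cross-entropy
```

With untrained (zero) weights the predicted distribution is uniform, so the loss equals log(num_classes).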

  2. Mutual loss is not the sum of losses. You can refer to the paper Deep Mutual Learning.
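To make the distinction concrete: in Deep Mutual Learning, each peer network's objective gets an extra KL-divergence term that pushes its predicted distribution toward its peer's, rather than simply summing two losses. A minimal NumPy sketch of the term added to network 1's objective (the function name and the `eps` clipping are my own assumptions):

```python
import numpy as np

def probability_mutual_loss(p1, p2, eps=1e-12):
    """KL(p2 || p1): the mimicry term added to network 1's loss in
    Deep Mutual Learning, where p1 and p2 are the softmax probability
    vectors predicted by the two peer networks for the same image.
    """
    p1 = np.clip(p1, eps, 1.0)                # avoid log(0)
    p2 = np.clip(p2, eps, 1.0)
    return float(np.sum(p2 * np.log(p2 / p1)))
```

The term is zero when the two networks agree exactly and grows as their predictions diverge; network 2 uses the symmetric counterpart KL(p1 || p2).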
