
Combine two kinds of loss function #45

Closed
wang5566 opened this issue Dec 27, 2017 · 6 comments

Comments


I want to train my model using softmax loss and triplet loss together. How can I combine the two losses in one network? Any reply will be appreciated.


Rizhiy commented Dec 28, 2017

You need to write a new loss, similar to the one in reid/loss/triplet.py, and use that instead. Usually you can just add the losses together; just make sure to scale them, since the softmax loss can be much larger than the triplet loss.

You would probably also need to write your own data loader, since softmax and triplet losses use different sampling techniques.
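A minimal sketch of the idea above, written with numpy rather than the repo's actual PyTorch code (function names here are illustrative, not from open-reid): compute each loss separately, then add them with a weight on the triplet term.

```python
import numpy as np

def softmax_cross_entropy(logits, label):
    # Numerically stable cross-entropy of a softmax over one sample's logits.
    shifted = logits - logits.max()
    log_probs = shifted - np.log(np.exp(shifted).sum())
    return -log_probs[label]

def triplet_margin_loss(anchor, positive, negative, margin=0.3):
    # Hinge-style triplet loss: push d(anchor, negative) past
    # d(anchor, positive) by at least the margin.
    d_pos = np.linalg.norm(anchor - positive)
    d_neg = np.linalg.norm(anchor - negative)
    return max(d_pos - d_neg + margin, 0.0)

def combined_loss(logits, label, anchor, positive, negative, lam=1.0):
    # Weighted sum of the two terms; lam rescales the triplet loss
    # so neither term dominates the gradient.
    return softmax_cross_entropy(logits, label) + \
        lam * triplet_margin_loss(anchor, positive, negative)
```

In a real training loop both terms would come from the same forward pass (logits from the classifier head, embeddings from the feature layer), and `lam` would be a hyperparameter tuned so the two terms start at comparable magnitudes.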

Owner

Cysu commented Dec 29, 2017

@Rizhiy Thank you very much for the answer!

Author

wang5566 commented Jan 5, 2018

@Rizhiy Thank you for your reply. But how do I scale these two losses? In my network the softmax loss is 1.5 and the triplet loss is 1.27, which seems unusual.


Rizhiy commented Jan 13, 2018

To be honest, I have never tried to train a network with multiple losses. Perhaps train with the softmax and triplet losses separately first, then compute a scale factor so that the two become roughly the same value.
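One way to read this suggestion as code (a hypothetical helper, not anything from the repo): take the initial values of the two losses and choose the weight so the scaled triplet term matches the softmax term at the start of training.

```python
def balance_weight(softmax_val, triplet_val, eps=1e-8):
    # Pick lam so that lam * triplet_val == softmax_val initially,
    # i.e. both terms contribute equally at the observed starting values.
    return softmax_val / (triplet_val + eps)
```

With the values mentioned above (softmax 1.5, triplet 1.27), this gives lam ≈ 1.18.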

Contributor

zydou commented Jan 20, 2018

You can add a weighting parameter to control the balance between them. For example:

loss = loss1 + lam * loss2

(written here as `lam`, since `lambda` is a reserved word in Python).

Owner

Cysu commented Feb 3, 2018

@Rizhiy @zydou Thanks a lot for your answers!

Cysu closed this as completed Feb 3, 2018

4 participants