
Use of the margin #56

Open
gombru opened this issue Nov 30, 2017 · 2 comments

Comments

gombru commented Nov 30, 2017

Hello,
I'm successfully using your tripletlosslayer.py to train a triplet net, but I have some doubts:

  • The self.margin parameter is only used in the forward() function to compute the loss, not in backward(). As I understand it, in Caffe the gradient should be set in bottom[].diff in the backward() function to do the backpropagation. So, as the code stands, the margin does not affect the training, only the displayed loss, right?
  • I see that in setup() you set self.a to 1, which you later use in backward() for the gradient computation. Why do you use that parameter and set it to 1?
luhaofang (Owner) commented Dec 1, 2017

Hi, I think the margin is a hyperparameter that determines which triplets count as positive (loss-incurring) cases; you may find that some triplet samples DON'T need to be backpropagated.

Please ignore the parameter self.a.

gombru commented Dec 1, 2017

OK. So, as I see it in the code, the margin value is only used to exclude from backpropagation those triplets that already satisfy the margin, but it has no influence on the gradients of the triplets that are backpropagated, right?
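(For anyone landing here later: the behavior discussed above can be sketched with plain NumPy. This is a hypothetical illustration of a standard hinge-based triplet loss, not the exact code in tripletlosslayer.py; the function name, the squared-Euclidean distance, and the 1/N scaling are my assumptions. The point is that the margin appears in the forward hinge, so it decides which triplets are active, but the gradient of an active triplet does not contain the margin term.)

```python
import numpy as np

def triplet_loss_fwd_bwd(anchor, positive, negative, margin=0.5):
    """Hypothetical sketch of a hinge triplet loss.

    Forward:  L = mean(max(0, d(a,p) - d(a,n) + margin))
    Backward: gradients are nonzero only for triplets with a positive
    hinge; the margin term is constant, so it drops out of the gradient.
    """
    # squared Euclidean distances per triplet
    d_ap = np.sum((anchor - positive) ** 2, axis=1)
    d_an = np.sum((anchor - negative) ** 2, axis=1)

    losses = np.maximum(0.0, d_ap - d_an + margin)
    active = losses > 0  # the margin only selects WHICH triplets backprop

    n = anchor.shape[0]
    grad_a = np.zeros_like(anchor)
    grad_p = np.zeros_like(positive)
    grad_n = np.zeros_like(negative)

    # dL/da = 2(n - p), dL/dp = 2(p - a), dL/dn = 2(a - n) for active
    # triplets; note the margin does not appear in any of these terms.
    grad_a[active] = 2.0 * (negative[active] - positive[active]) / n
    grad_p[active] = 2.0 * (positive[active] - anchor[active]) / n
    grad_n[active] = 2.0 * (anchor[active] - negative[active]) / n

    return np.sum(losses) / n, grad_a, grad_p, grad_n
```

Running this with two different margins on the same active triplet gives different loss values but identical gradients, which matches the conclusion above: the margin changes which triplets train, not how hard the active ones are pushed.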
