
add contrastive loss #801

Merged (3 commits) on Jun 18, 2017

Conversation

willduan
Contributor

add contrastive loss for siamese network
Computes the contrastive loss between y_pred (logits) and y_true (labels).
(http://yann.lecun.com/exdb/publis/pdf/chopra-05.pdf)
Sumit Chopra, Raia Hadsell and Yann LeCun (2005).
Learning a Similarity Metric Discriminatively, with Application to Face Verification.

Arguments:
y_pred: Tensor. Predicted values.
y_true: Tensor. Targets (labels).
margin: Float. A user-set parameter specifying the desired distance between features of different identities. Defaults to 1.
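A minimal NumPy sketch of the documented behavior, assuming `y_pred` holds precomputed pairwise distances (the function name and the 0.5 factors follow the Chopra et al. convention and are illustrative, not taken from the PR's code):

```python
import numpy as np

def contrastive_loss(y_pred, y_true, margin=1.0):
    # y_pred: precomputed distances between paired embeddings.
    # y_true: 1 for same-identity pairs, 0 for different-identity pairs.
    pos = 0.5 * y_true * np.square(y_pred)                              # pull similar pairs together
    neg = 0.5 * (1.0 - y_true) * np.square(np.maximum(0.0, margin - y_pred))  # push dissimilar pairs apart
    return np.mean(pos + neg)
```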

Member

@aymericdamien aymericdamien left a comment


LGTM! Thanks for adding the contrastive loss :)

@aymericdamien aymericdamien merged commit 8ce65f4 into tflearn:master Jun 18, 2017
@polajnta

This version of contrastive loss is a bit weird. The other definitions I saw basically compare two sets of outputs and whether they belong to the same class or not. This version is missing the label that indicates whether the outputs belong to the same class. So it cannot be used to do paired loss correctly.

@willduan
Contributor Author

willduan commented Jul 16, 2017

@polajnta Hi, the comparison of the two Siamese nets' outputs (the Euclidean distance) needs to be added yourself when you build the network, for example:

 # Euclidean distance between the two branch outputs
 distance = tf.sqrt(tf.reduce_sum(tf.square(tf.subtract(out1, out2)), 1, keep_dims=True))
 # Normalize by the sum of the two output norms
 distance = tf.div(distance, tf.add(
     tf.sqrt(tf.reduce_sum(tf.square(out1), 1, keep_dims=True)),
     tf.sqrt(tf.reduce_sum(tf.square(out2), 1, keep_dims=True))))
 distance = tf.reshape(distance, [-1], name="distance")

And the contrastive loss just computes the residual between y_true and y_pred (that is, the distance).
So I guess you do not have a complete understanding of the Siamese network.

@polajnta

Right, I was just expecting a self-contained function like the one in:
https://github.com/tiagofrepereira2012/examples.tensorflow/blob/master/examples/tensorflow/script/train_mnist_siamese.py
L = 0.5 * (Y) * D^2 + 0.5 * (1-Y) * {max(0, margin - D)}^2
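That formula can be sketched as a self-contained NumPy function taking both branch outputs and a pair label (the name `pairwise_contrastive_loss` and the plain Euclidean `D` are assumptions for illustration, not the linked script's exact code):

```python
import numpy as np

def pairwise_contrastive_loss(out1, out2, y, margin=1.0):
    # Implements L = 0.5*Y*D^2 + 0.5*(1-Y)*max(0, margin - D)^2,
    # where D is the Euclidean distance between the two branch outputs
    # and y is 1 for same-class pairs, 0 for different-class pairs.
    d = np.sqrt(np.sum(np.square(out1 - out2), axis=1))
    loss = 0.5 * y * np.square(d) + 0.5 * (1.0 - y) * np.square(np.maximum(0.0, margin - d))
    return np.mean(loss)
```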
