Why is there no cosine similarity metric? #52

Closed

LiNaihan opened this issue Jan 11, 2018 · 2 comments

Comments

LiNaihan commented Jan 11, 2018
Hi, @Cysu!
I think your work is wonderful, and this repository really helps a lot in person re-ID! But I have a few questions.

First, all of your metrics are based on Euclidean distance, as is clear from the "pairwise_distance" function in reid.evaluator. But as far as I know, cosine similarity is also a popular metric.

Second, I noticed that the feature extracted by ResNet is the logit computed by ResNet.classifier, which confused me, since I think the feature should be the output of the avgpool layer; that would be consistent with other authors' work. I also noticed the "cut_at_pooling" option in ResNet, which I think is meant to solve this, but the cut should come after

x = F.avg_pool2d(x, x.size()[2:])
x = x.view(x.size(0), -1)

instead of before those lines (a rough sketch of what I mean follows below). I'm really interested in your work, and I hope to hear your reply!
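For illustration, here is a minimal standalone sketch using torchvision's resnet50 as a stand-in for this repo's model (the shapes and variable names are my own assumptions, not this codebase):

import torch
import torch.nn.functional as F
from torchvision.models import resnet50

# Hypothetical sketch: take the re-ID feature after global average
# pooling, before the classifier, instead of the classifier logits.
model = resnet50(pretrained=False)
trunk = torch.nn.Sequential(*list(model.children())[:-2])  # drop avgpool + fc

x = torch.randn(2, 3, 256, 128)            # dummy batch of person crops
feat_map = trunk(x)                        # (2, 2048, 8, 4) conv feature map
feat = F.avg_pool2d(feat_map, feat_map.size()[2:])  # global average pooling
feat = feat.view(feat.size(0), -1)         # (2, 2048) feature vectors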

zydou (Contributor) commented Jan 20, 2018

If you want cosine similarity, you can refer to this to write a cosine similarity function:

import torch

def cosine_similarity(x1, x2, dim=1, eps=1e-8):
    r"""Returns cosine similarity between x1 and x2, computed along dim.

    .. math ::
        \text{similarity} = \dfrac{x_1 \cdot x_2}{\max(\Vert x_1 \Vert _2 \cdot \Vert x_2 \Vert _2, \epsilon)}

    Args:
        x1 (Variable): First input.
        x2 (Variable): Second input (of size matching x1).
        dim (int, optional): Dimension of vectors. Default: 1
        eps (float, optional): Small value to avoid division by zero.
            Default: 1e-8

    Shape:
        - Input: :math:`(\ast_1, D, \ast_2)` where D is at position `dim`.
        - Output: :math:`(\ast_1, \ast_2)` where 1 is at position `dim`.

    >>> input1 = autograd.Variable(torch.randn(100, 128))
    >>> input2 = autograd.Variable(torch.randn(100, 128))
    >>> output = F.cosine_similarity(input1, input2)
    >>> print(output)
    """
    w12 = torch.sum(x1 * x2, dim)
    w1 = torch.norm(x1, 2, dim)
    w2 = torch.norm(x2, 2, dim)
    return (w12 / (w1 * w2).clamp(min=eps)).squeeze()

Copied from http://pytorch.org/docs/0.2.0/_modules/torch/nn/functional.html#cosine_similarity
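
For the pairwise use case in reid.evaluator, here is a minimal batched sketch (an adaptation I wrote, not the library's API; the query/gallery names are illustrative):

import torch

def pairwise_cosine_similarity(query, gallery, eps=1e-8):
    # Normalize each row to unit L2 norm, then a single matrix product
    # yields all pairwise cosine similarities at once.
    q = query / query.norm(p=2, dim=1, keepdim=True).clamp(min=eps)
    g = gallery / gallery.norm(p=2, dim=1, keepdim=True).clamp(min=eps)
    return q.mm(g.t())  # (m, n) similarity matrix

query = torch.randn(100, 128)
gallery = torch.randn(200, 128)
sim = pairwise_cosine_similarity(query, gallery)  # shape (100, 200)

Note that ranking would then sort by descending similarity rather than ascending distance.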

Cysu (Owner) commented Feb 3, 2018

@zydou Thanks a lot!

@ElijhaLee L2-normalized features + Euclidean distance is equivalent to cosine similarity. Using classification scores rather than pooled features slightly improves performance in my experiments, though the difference is small.
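
For reference, the equivalence holds because for unit vectors ||a - b||^2 = ||a||^2 + ||b||^2 - 2 a·b = 2 - 2 cos(a, b), so ascending Euclidean distance on L2-normalized features gives the same ranking as descending cosine similarity. A quick numerical check (a standalone sketch, not code from this repo):

import torch

x1 = torch.randn(5, 128)
x2 = torch.randn(5, 128)

# L2-normalize each row to unit length
a = x1 / x1.norm(p=2, dim=1, keepdim=True)
b = x2 / x2.norm(p=2, dim=1, keepdim=True)

cos = (a * b).sum(dim=1)               # cosine similarity per row
dist_sq = ((a - b) ** 2).sum(dim=1)    # squared Euclidean distance per row

# For unit vectors: ||a - b||^2 = 2 - 2 * cos(a, b)
print(torch.allclose(dist_sq, 2 - 2 * cos, atol=1e-6))  # True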

Cysu closed this as completed Feb 3, 2018