
For Mutual Learning, is the N*N distance matrix computed from the local distances? #22

Closed
hongnianwang opened this issue Mar 15, 2018 · 7 comments


@hongnianwang

Looking at your code, it seems to use the local distances, while the original paper uses the global ones... I'm a bit confused.

@huanghoujing
Owner

Thanks for your interest!

Here either the global or the local distance matrix can be used for mutual learning; you can choose which one to use, see here and here.
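
For reference, a minimal sketch of how such an N*N distance matrix can be built from a batch of features (the function name and the (N, d) "feat" input are hypothetical, not the repo's actual names); either the global matrix or an aggregated local matrix built this way can then be fed to the mutual loss:

  import torch

  def euclidean_dist_mat(feat):
    # feat: (N, d) batch of features; returns an (N, N) matrix whose
    # (i, j) entry is the Euclidean distance between samples i and j.
    sq = feat.pow(2).sum(dim=1, keepdim=True)       # (N, 1) squared norms
    dist_sq = sq + sq.t() - 2. * feat.mm(feat.t())  # ||a||^2 + ||b||^2 - 2ab
    return dist_sq.clamp(min=1e-12).sqrt()          # clamp for numerical safety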

@Phoebe-star

Hi, do you know what the "zero gradient" in Eq. (3) of the paper is?

And in your code in train_ml.py, line 519:

  # Global Distance Mutual Loss (L2 Loss)
  gdm_loss = 0
  if (cfg.num_models > 1) and (cfg.gdm_loss_weight > 0):
    for j in range(cfg.num_models):
      if j != i:
        gdm_loss += torch.sum(torch.pow(
          g_dist_mat - TVT(g_dist_mat_list[j]).detach(), 2))
    gdm_loss /= 1. * (cfg.num_models - 1) * len(ims) * len(ims)

I can't understand "g_dist_mat" and "TVT(g_dist_mat_list[j]).detach()".
If "TVT(g_dist_mat_list[j]).detach()" is the zero gradient, then what does it do?
Thanks

@huanghoujing
Owner

Hi, yes, detach is my understanding of the zero gradient in the paper. When .detach() is applied to a Variable, it returns a new Variable that is a leaf node, so no gradient flows back through it.
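
A small self-contained demo of that behavior (hypothetical values, written against the current PyTorch tensor API rather than the old Variable one):

  import torch

  x = torch.tensor([1., 2.], requires_grad=True)
  y = (x * 2).detach()  # cut from the graph: a constant w.r.t. x
  z = x * 3             # stays in the graph

  loss = torch.sum(torch.pow(z - y, 2))  # same form as the mutual L2 loss
  loss.backward()

  print(x.grad)  # tensor([ 6., 12.]): gradient flowed only through z, not y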

@Phoebe-star

Could you explain it again in Chinese? What is the zero gradient?
Thank you

@huanghoujing
Owner

My understanding of zero gradient is that the gradient stops propagating at that point and goes no further, so here it can be implemented with Variable's detach method.

@Phoebe-star

Does this mean the mutual loss is not backpropagated, and only the metric loss is backpropagated?

@michuanhaohao

@Phoebe-star Hi, I'm the first author of the paper. "Zero grad" means treating this variable as a constant. The original paper used Megvii's framework; PyTorch doesn't have that operator, but it can be implemented in PyTorch with detach.
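
To make "treat it as a constant" concrete, a hedged sketch with made-up names (d_a is the current model's distance matrix, d_b the peer's); in actual mutual learning each model does this symmetrically, with the other model's matrix detached:

  import torch

  d_a = torch.rand(4, 4, requires_grad=True)  # current model's distance matrix
  d_b = torch.rand(4, 4, requires_grad=True)  # peer model's distance matrix

  # detach() freezes the peer's matrix, so this loss pulls d_a toward d_b
  # but sends no gradient back into the peer through this term.
  mutual_loss = torch.sum(torch.pow(d_a - d_b.detach(), 2)) / d_a.numel()
  mutual_loss.backward()

  print(d_a.grad is not None)  # True: the current model still gets gradients
  print(d_b.grad)              # None: the peer saw zero gradient from this loss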
