
detach #6

Open
zhanghongruiupup opened this issue Aug 13, 2020 · 4 comments

Comments

@zhanghongruiupup

In do_train, is it necessary to use detach to cut feats and targets off from the computation graph?

@zhanghongruiupup
Author

Hello, sorry to bother you, I have a question about a detail. With loss = loss + cfg.XBM.WEIGHT * xbm_loss, the contrastive loss is effectively computed over two batches' worth of data. Why not just use loss = xbm_loss directly?

@bnu-wangxun

  1. Yes, detach is needed.

  2. The loss on the current mini-batch is also important, so we don't use only xbm_loss. You can also enqueue the current mini-batch into the xbm first; the results are about the same.
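To make the two points concrete, here is a minimal PyTorch sketch of a training step in this style. The XBM class, contrastive_loss, and train_step below are simplified stand-ins, not the repo's actual API; only the pattern taken from this thread (detach before enqueue, enqueue first, then loss = loss + cfg.XBM.WEIGHT * xbm_loss) is what is being illustrated.

```python
import torch
import torch.nn.functional as F

class XBM:
    """Minimal FIFO cross-batch memory. Illustrative only; the real
    implementation in the repo may differ."""

    def __init__(self, size, feat_dim):
        self.size = size
        self.feats = torch.zeros(size, feat_dim)
        self.targets = torch.zeros(size, dtype=torch.long)
        self.ptr = 0
        self.full = False

    def enqueue_dequeue(self, feats, targets):
        n = feats.size(0)
        if self.ptr + n > self.size:  # wrap around once the queue is full
            self.ptr = 0
            self.full = True
        self.feats[self.ptr:self.ptr + n] = feats
        self.targets[self.ptr:self.ptr + n] = targets
        self.ptr += n

    def get(self):
        end = self.size if self.full else self.ptr
        return self.feats[:end], self.targets[:end]

def contrastive_loss(feats, targets, ref_feats, ref_targets, margin=0.5):
    """Toy pairwise contrastive loss between a batch and reference embeddings."""
    sim = feats @ ref_feats.t()                             # cosine similarity for L2-normalized inputs
    pos = targets.unsqueeze(1) == ref_targets.unsqueeze(0)  # same-label pair mask
    loss = (1.0 - sim)[pos].sum() + torch.relu(sim - margin)[~pos].sum()
    return loss / feats.size(0)

def train_step(model, xbm, images, targets, xbm_weight):
    feats = F.normalize(model(images), dim=1)

    # Point 2: the loss on the current mini-batch alone still matters.
    loss = contrastive_loss(feats, targets, feats, targets)

    # Point 1: detach() before enqueuing, so the memory stores plain
    # tensors and no gradient ever flows back through stale entries.
    # The batch is enqueued *before* the xbm loss is computed, matching
    # the ordering discussed in the comments below.
    xbm.enqueue_dequeue(feats.detach(), targets.detach())

    # Current batch against the whole memory (which now includes it).
    xbm_feats, xbm_targets = xbm.get()
    xbm_loss = contrastive_loss(feats, targets, xbm_feats, xbm_targets)

    # Weighted sum, as in loss = loss + cfg.XBM.WEIGHT * xbm_loss.
    return loss + xbm_weight * xbm_loss
```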

@wangxinrugithub

  1. Yes, detach is needed.
  2. The loss on the current mini-batch is also important, so we don't use only xbm_loss. You can also enqueue the current mini-batch into the xbm first; the results are about the same.

Regarding your point 2: I read the code carefully, and the current code first enqueues the current mini_batch into the xbm and only then computes the loss.

@wangxinrugithub

  1. Yes, detach is needed.
  2. The loss on the current mini-batch is also important, so we don't use only xbm_loss. You can also enqueue the current mini-batch into the xbm first; the results are about the same.

I used only xbm_loss, and the loss failed to converge.
