Why setting label to 0 for all data in gen_data_set_youtube()? #1
-
I'm new to this area so it may be a dumb question.
Replies: 1 comment
-
Hi, it's a good question, and one I thought about for a long time too.
English Version:
As you may know, PyTorch has no built-in equivalent of TensorFlow's sampled_softmax_loss. So I use CrossEntropyLoss in place of sampled_softmax_loss by constructing each training example as a one-hot style target [1, 0, 0, 0, 0]: the positive sample is always placed at the first position, and all the remaining entries are sampled negatives.
Additionally, CrossEntropyLoss takes a Target of shape (N) holding class indices, so the label should be 0 (the index of the positive sample). You can see this in my code, in the _step function of PLBaseModel.py: https://github.com/bbruceyuan/DeepMatch-Torch/blob/322589cbe7f6d4efb1c4094bd6fb62f599f62c29/deepmatch_torch/models/PLBaseModel.py

If you have a better PyTorch version of sampled_softmax_loss …
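To make the idea concrete, here is a minimal sketch in plain Python (no torch dependency) of what CrossEntropyLoss computes when the target index is 0. The function name and scores are my own illustration, not code from the repository; the point is that cross-entropy over [positive, negatives] with target 0 is exactly a sampled softmax loss on the positive item.

```python
import math

def sampled_softmax_loss(pos_score, neg_scores):
    # Cross-entropy over the candidate list [positive, neg_1, ..., neg_K]
    # with target index 0 (the positive always sits first):
    #   loss = -log( exp(pos) / sum_i exp(logit_i) )
    logits = [pos_score] + list(neg_scores)
    m = max(logits)  # subtract the max for numerical stability
    log_sum_exp = m + math.log(sum(math.exp(x - m) for x in logits))
    return log_sum_exp - pos_score  # == -log softmax(logits)[0]

# Every example uses label 0 because the positive is always at index 0;
# this is why gen_data_set_youtube() can set label = 0 for all data.
loss = sampled_softmax_loss(2.0, [0.5, -1.0, 0.3, 0.0])
```

Raising the positive score (or lowering the negative scores) drives the loss toward zero, which is the same training signal sampled_softmax_loss provides in TensorFlow.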