Hi,
I see your code for NCESoftmaxLoss as follows:
#########
import torch
import torch.nn as nn

class NCESoftmaxLoss(nn.Module):
    """Softmax cross-entropy loss (a.k.a. the InfoNCE loss in the CPC paper)."""
    def __init__(self):
        super(NCESoftmaxLoss, self).__init__()
        self.criterion = nn.CrossEntropyLoss()

    def forward(self, x):
        bsz = x.shape[0]
        x = x.squeeze()
        label = torch.zeros([bsz]).cuda().long()
        loss = self.criterion(x, label)
        return loss
###########
The label for this loss is label = torch.zeros([bsz]).cuda().long(), i.e. all zeros, but according to Eq. 2 in your paper, each sample has its own positive.
So is something missing here?
Thanks.
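For reference, here is a toy sketch (my own, CPU-only, with made-up data and variable names) of how an all-zero label vector can still encode one positive per sample, under the assumption that the logits for each sample are laid out as [positive, negative_1, ..., negative_K]:

```python
import torch
import torch.nn as nn

# Assumption (not from the repo): column 0 of the logit matrix holds the
# positive-pair similarity for each sample, columns 1..K hold the negatives.
bsz, num_negatives = 4, 7
logits = torch.randn(bsz, 1 + num_negatives)  # toy similarity scores

criterion = nn.CrossEntropyLoss()
# label 0 for every row -> "the correct class is column 0", i.e. each sample's
# own positive, even though the label tensor is all zeros.
label = torch.zeros(bsz).long()
loss = criterion(logits, label)
```

So if the scores are arranged that way before being passed to forward(), the all-zero labels would be consistent with one positive per sample.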