From the function definition, it looks like Q is a Laplace distribution: Q(x) = exp(-|x - mu| / b) / (2b), so taking the log gives log Q(x) = log(1/(2b)) - |x - mu| / b.
Looking at loss = nf_loss + Q_logprob, where nf_loss = log(sigma) - log_phi, and comparing with the paper, which says loss = -log(Q(bar_mu)) + log(sigma) - log_phi, I guess Q_logprob already has the negative sign inside. But then why is it not -log(1/(2b)) + |x - mu| / b? That is, shouldn't the first term, torch.log(sigma / self.amp), have a minus sign?
On a related note, how do you choose the value of b? In the distribution sense, b = sqrt(variance / 2).
The first term -log(1/(2b)) = log(2b) = log(sigma) + log(constant), so it is equivalent to torch.log(sigma). The constant term can be ignored when computing the loss.
For the second question, b = sqrt(variance * 2) = sqrt(2) * sigma in our implementation.
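A quick numeric check of the constant-term argument (a standalone sketch in plain Python, not the repo's code; `neg_log_laplace` and `logQ_expr` are hypothetical names for the full negative Laplace log-density and the expression used in logQ):

```python
import math

def neg_log_laplace(x, mu, b):
    # Full negative log-density of Laplace(mu, b):
    # -log Q(x) = log(2b) + |x - mu| / b
    return math.log(2 * b) + abs(x - mu) / b

def logQ_expr(x, mu, sigma):
    # Expression used in logQ (with amp = 1), constant dropped
    return math.log(sigma) + abs(x - mu) / (math.sqrt(2) * sigma)

# With b = sqrt(2) * sigma, the two differ by log(2*sqrt(2)) ~ 1.0397
# for any x, mu, and sigma, so the gradients are identical.
for x, mu, sigma in [(0.3, 0.1, 0.5), (-1.2, 0.4, 2.0)]:
    b = math.sqrt(2) * sigma
    print(neg_log_laplace(x, mu, b) - logQ_expr(x, mu, sigma))
```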
As @nviwch pointed out, b = sqrt(variance / 2) = sigma / sqrt(2).
But in the implementation, b = sqrt(variance * 2) = sigma * sqrt(2).
Is this a bug? @Jeff-sjtu
```python
def logQ(self, gt_uv, pred_jts, sigma):
    return torch.log(sigma / self.amp) + torch.abs(gt_uv - pred_jts) / (math.sqrt(2) * sigma + 1e-9)
```
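The factor-of-2 discrepancy in the two readings of b can be made concrete (a minimal sketch in plain Python; `b_dist` and `b_impl` are illustrative names, not from the repo):

```python
import math

sigma = 0.5                      # standard deviation of the residual
# For a Laplace distribution, Var = 2 * b**2, so matching the variance gives:
b_dist = sigma / math.sqrt(2)    # b = sqrt(variance / 2)
# The denominator in logQ corresponds instead to:
b_impl = math.sqrt(2) * sigma    # b = sqrt(variance * 2)

# The two scale choices differ by a constant factor of 2
print(b_impl / b_dist)
```

Since b only rescales the residual term and shifts the (dropped) constant, this changes the effective weighting of |x - mu| relative to log(sigma), not the form of the loss.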