I've been using `KNNDivergenceEstimator` to estimate the divergence between two samples. This might be a dumb question, but how can the Rényi-alpha divergence estimates be negative?
I understand that if I set `clamp=True` then it imposes a limit, but I'm not sure how the divergence estimator from B. Póczos, L. Xiong, D. J. Sutherland, & J. Schneider (2012) could return negative values.
Thanks!
So if you look at http://www.gatsby.ucl.ac.uk/~dougals/papers/cvpr-12.pdf: the alpha-beta divergence estimator (3) will always be positive. But the Rényi divergence is `1/(alpha - 1) * log D_{alpha-1, 1-alpha}`. For the usual situation of alpha < 1, the prefactor `1/(alpha - 1)` is negative, so the corresponding Rényi estimator is negative whenever the estimate of `D_{alpha-1, 1-alpha}` is greater than 1.
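To make the sign flip concrete, here's a quick arithmetic sketch (the numeric values are hypothetical, just chosen to show the effect):

```python
import math

# Hypothetical numbers: with alpha = 0.5 the prefactor 1/(alpha - 1) is
# negative, so any finite-sample estimate of the alpha-beta divergence
# D_{alpha-1, 1-alpha} that exceeds 1 yields a negative Renyi estimate.
alpha = 0.5
d_hat = 1.2  # estimated D_{alpha-1, 1-alpha}; nothing stops it exceeding 1

renyi_hat = math.log(d_hat) / (alpha - 1)
print(renyi_hat)  # negative: log(1.2) > 0 and alpha - 1 = -0.5 < 0
```

The true divergence is nonnegative, but the plug-in estimate of `D` has no such constraint, which is exactly what the `clamp` option is there to paper over.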