Bits per dimension #3
Comments
You can find it at this line: https://github.com/openai/glow/blob/master/model.py#L185. Basically it is the same as the reddit posts: subtract log(n_bins) and change the log base to 2. I used the same formula as the TensorFlow implementation, but I haven't directly compared the numbers.
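As a sketch of the conversion being described (assuming the loss is a negative log-likelihood in nats for an image whose pixels were rescaled from `n_bins` discrete levels to a continuous range; the function and argument names here are illustrative, not taken from either repo):

```python
import math

def bits_per_dim(nll_nats, n_pixels, n_bins=256):
    """Convert a total negative log-likelihood in nats into bits per dimension.

    Adding n_pixels * log(n_bins) accounts for the change of variables
    from discrete {0, ..., n_bins - 1} pixel values to the rescaled
    continuous input (this is the "subtract log(n_bins)" correction,
    applied to the log-likelihood, i.e. added to the NLL).
    Dividing by log(2) converts nats to bits.
    """
    return (nll_nats + n_pixels * math.log(n_bins)) / (n_pixels * math.log(2))
```

As a sanity check, a model that assigns log-density 0 everywhere on the rescaled input (a uniform density) should come out at exactly `log2(n_bins)` bits per dimension, e.g. 8 bits/dim for 256 pixel levels.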
Thanks, I'm sure your loss function is comparable to the one in the TensorFlow implementation. My question is about how it relates to bits per dimension; I don't think this
I don't know why, but it should match. You can also refer to openai/glow#43.
Never mind. I think you're right.
Can anyone please explain what bits/dimension is? I've just started working with GANs, and almost every second paper nowadays reports some form of bits/dimension metric, but I tried really hard to find literature on it and there's barely any information out there. Or can anyone at least point me to a source (repo) I can use to compute this metric on my own generated data?
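Roughly speaking, bits per dimension is the model's negative log-likelihood expressed in base 2 and averaged over the number of dimensions (pixels times channels for images); it can be read as the average number of bits an ideal lossless compressor built on the model would need per pixel. A toy illustration (the names here are hypothetical, not from any particular repo):

```python
import math

def bpd_discrete(total_log_prob_nats, n_dims):
    """Bits per dimension from the total log-probability (in nats)
    a model assigns to one sample with n_dims dimensions."""
    return -total_log_prob_nats / (n_dims * math.log(2))

# A model that is uniform over 256 possible pixel values assigns
# log p(x) = n_dims * log(1/256), which works out to exactly
# 8 bits per pixel: no compression at all over raw 8-bit storage.
n_dims = 32 * 32 * 3
log_p = n_dims * math.log(1 / 256)
print(bpd_discrete(log_p, n_dims))  # ~8.0
```

A good generative model assigns higher probability to real data, so its bits/dim falls below 8; lower is better.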
Do you know how to map your loss values to bits-per-dimension results (see Table 2 in the paper)? I'm having a hard time coming up with a formula for the correspondence. Some reddit post mentions subtracting `math.log(128)` to take the input scaling into account, but it still doesn't seem right. I looked at the original implementation in TensorFlow but couldn't figure it out. Would you mind letting me know what you think about it? Also, do you know how close your implementation is to the original code in terms of bits per dimension? Thank you.