
Bits per dimension #3

Closed
tangbinh opened this issue Dec 24, 2018 · 5 comments

Comments

@tangbinh

Do you know how to map your loss values to the bits per dimension results (see Table 2 in the paper)? I'm having a hard time coming up with a formula for the correspondence. Some Reddit posts mention subtracting math.log(128) to account for scaling, but that still doesn't seem right.

I looked at the original TensorFlow implementation but couldn't figure it out. Would you mind letting me know what you think? Also, do you know how close your implementation is to the original code in terms of bits per dimension? Thank you.

@rosinality
Owner

rosinality commented Dec 24, 2018

You can find it at this line: https://github.com/openai/glow/blob/master/model.py#L185. Basically it is the same as the Reddit posts: subtract log(n_bins) per dimension and change the log base to 2. I used the same formula as the TensorFlow implementation, but I haven't directly compared the numbers.
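
For reference, here is a minimal sketch of that conversion (the function name, signature, and defaults are illustrative, not from either repo; it assumes the model's log-likelihood, log p(x) plus the flow's log-determinant, is in nats and is computed on inputs rescaled so that each quantization bin has width 1 / n_bins):

```python
import math

def bits_per_dim(log_likelihood_nats, n_pixels, n_bins=256):
    """Convert a flow model's log-likelihood (in nats) to bits per dimension.

    log_likelihood_nats: log p(x) + log-determinant for one image, in nats,
                         on inputs rescaled so each bin has width 1 / n_bins.
    n_pixels:            number of dimensions, e.g. 32 * 32 * 3 = 3072 for CIFAR-10.
    n_bins:              number of quantization levels (256 for 8-bit images,
                         32 when training on 5-bit inputs).
    """
    # Discretization/scaling correction: add log(n_bins) per dimension to the
    # negative log-likelihood, then convert nats to bits by dividing by log(2).
    nll_nats = -log_likelihood_nats + n_pixels * math.log(n_bins)
    return nll_nats / (n_pixels * math.log(2))
```

With per-image log-likelihoods averaged over the test set, this should give the same quantity as bits_x in the TensorFlow code.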

@tangbinh
Author

Thanks, I'm sure your loss function is comparable to that of the TensorFlow implementation. My question is about how it relates to bits per dimension.

I don't think this bits_x number is comparable to bits per dimension as reported in the paper. In fact, when I ran the TensorFlow code on CIFAR-10, I got bits_x as low as 2.94 after about 60 epochs (and it definitely gets lower with more training), but the reported value for bits per dimension is 3.35.

@rosinality
Owner

I don't know why, but it should match. You can also refer to openai/glow#43.

@tangbinh
Author

Never mind. I think you're right. bits_x is what they meant by bits per dimension. I was looking at training statistics, which of course are lower than test statistics.

@braindotai

Can anyone please explain what bits/dimension is? I've just started working with GANs, and almost every second paper nowadays reports some form of bits/dimension metric. I tried really hard to find literature on it, but nothing is out there.

Or can anyone at least point me to a source (repo) I can use to compute this metric on my own generated data?
