Is "theoretical bpp" calculated by Hyperprior model? #41
Comments
Hi Yifei, Yes. The hyperprior model tries to find the probability distribution of the latent code, then calculates the entropy of that distribution and reports the entropy as the expected bitrate. Ali |
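As a rough sketch of the idea above (toy values and hypothetical function names, not the repository's actual code): the expected bitrate is the negative log-likelihood of the quantized latents under the learned prior, normalized by the number of image pixels.

```python
import numpy as np

def theoretical_bpp(likelihoods, num_pixels):
    """Expected bits per pixel: entropy of the quantized latents under
    the learned prior, i.e. -sum(log2 p(y)) / num_pixels."""
    total_bits = -np.sum(np.log2(likelihoods))
    return total_bits / num_pixels

# Toy example: 4 latent symbols, each assigned probability 0.5 by the
# prior, over a 2x2 "image" -> 4 bits / 4 pixels = 1.0 bpp.
p = np.full(4, 0.5)
print(theoretical_bpp(p, num_pixels=4))  # → 1.0
```

During training this quantity (summed over the latent and hyperlatent tensors) is what serves as the rate term in the rate-distortion loss.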
Hi Ali, If we use the entropy loss as the real bpp, we need to measure how accurate it is. Many papers simply report the entropy loss as the real bpp, but that is not correct, since they do not compare the entropy predicted by the hyperprior model with the real bpp. Whether those methods are actually better than BPG is questionable, because BPG is measured on real file size while their methods are not. I will address this issue in my paper. Thank you! Sincerely, |
I deeply appreciate your quick response, yifeipet. I am going to discuss my understanding from reading papers in this area, and I would be really grateful if you could share your thoughts wherever I am wrong.
Sure. No one uses the entropy predicted by the hyperprior as the real bit rate. It is just an approximation that enables you to optimize your model for the rate-distortion trade-off.
As far as I have read in the neural image compression literature, when it comes to comparing performance with BPG, JPEG, or any other codec, everybody reports the actual bitrate, which is calculated from the size of the bitstream produced by the entropy coder (e.g. an arithmetic coder). I totally agree that comparing the entropy loss predicted by the hyperprior model with the final bitrate of BPG is obviously wrong. Regards, |
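For reference, the "real" bpp discussed above is simply the entropy-coded file size in bits divided by the pixel count. A trivial sketch (hypothetical function name, toy file):

```python
import os
import tempfile

def real_bpp(bitstream_path, height, width):
    """Actual bits per pixel: size of the entropy-coded bitstream
    (in bits) divided by the number of image pixels."""
    size_bits = 8 * os.path.getsize(bitstream_path)
    return size_bits / (height * width)

# Toy check: a 512-byte "bitstream" for a 64x64 image
# -> 4096 bits / 4096 pixels = 1.0 bpp.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\x00" * 512)
print(real_bpp(f.name, 64, 64))  # → 1.0
os.remove(f.name)
```

The gap between this number and the theoretical bpp comes from quantization mismatch, coder overhead, and any side information written to the file.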
Hi all,
Sorry for the slow response. I think the theoretical bpp is computed as the entropy of the latents and hyperlatents, as output by their respective probability models.
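That description, i.e. summing the entropy of the latents y under the conditional prior and the entropy of the hyperlatents z under the factorized hyperprior, can be sketched as follows (hypothetical names and toy probabilities, not the repository's API):

```python
import numpy as np

def total_theoretical_bpp(latent_likelihoods, hyperlatent_likelihoods, num_pixels):
    """Theoretical bpp = (entropy of y under the conditional prior
    + entropy of z under the factorized hyperprior) / num_pixels."""
    bits_y = -np.sum(np.log2(latent_likelihoods))
    bits_z = -np.sum(np.log2(hyperlatent_likelihoods))
    return (bits_y + bits_z) / num_pixels

# Toy check: 8 latent symbols at p=0.5 (8 bits) plus 2 hyperlatent
# symbols at p=0.25 (4 bits) over a 4x3 image -> 12 bits / 12 px = 1.0 bpp.
print(total_theoretical_bpp(np.full(8, 0.5), np.full(2, 0.25), 12))  # → 1.0
```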
|
Hi Justin, Thank you! Sincerely, |
Hello Justin,
Thank you for your outstanding reference code; I learned a lot from it and obtained high-quality reconstructed images. I have a question about the "theoretical bpp": is it calculated by the hyperprior model?
Is the hyperprior model accurate? I ask because I see a big gap between the real bpp and the "theoretical bpp".
Thank you!
Sincerely,
Yifei