
DAIA #2

Closed

qiulinzhang opened this issue Aug 24, 2021 · 3 comments

@qiulinzhang

Thanks for your great paper on SR quantization. I have one question about the method:

Regarding DAIA: is there any other difference from LSQ except your initial warm-up to initialize the step size?

Or did you specialize LSQ for the SR task, and thus arrive at your Distribution-Aware Interval Adaptation?

@qiulinzhang (Author)

Another question: are there experimental results comparing with LSQ or EWGS?

@blueardour (Member)

Hi, we followed LSQ as the quantizer in the SR task. The main contributions, in my opinion, are:

  1. Quantizing all layers: not only the non-linear mapping, but also the feature extraction and reconstruction.
  2. The self-supervised loss function, which shows an obvious benefit on the SR task.

For a comparison with LSQ, please refer to Table 5: FQSR improves over LSQ by about 1.3 dB in PSNR.
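For context, here is a minimal sketch of an LSQ-style quantizer (Esser et al., "Learned Step Size Quantization") with the warm-up step-size initialization the question refers to, assuming PyTorch. The class name `LSQQuantizer` and the `warm_up_init` method are illustrative, not taken from the FQSR code.

```python
import math
import torch
import torch.nn as nn

def grad_scale(x, scale):
    # Forward: pass x through unchanged. Backward: scale x's gradient by `scale`.
    return (x - x * scale).detach() + x * scale

def round_ste(x):
    # Round with a straight-through estimator (identity gradient).
    return (x.round() - x).detach() + x

class LSQQuantizer(nn.Module):
    def __init__(self, bits=4, symmetric=True):
        super().__init__()
        self.qn = -(2 ** (bits - 1)) if symmetric else 0
        self.qp = 2 ** (bits - 1) - 1 if symmetric else 2 ** bits - 1
        self.step = nn.Parameter(torch.tensor(1.0))  # learned step size
        self.initialized = False

    @torch.no_grad()
    def warm_up_init(self, x):
        # The "warm-up" initialization: set the step size from the statistics
        # of the first batch, as in the LSQ paper.
        self.step.copy_(2 * x.abs().mean() / math.sqrt(self.qp))
        self.initialized = True

    def forward(self, x):
        if not self.initialized:
            self.warm_up_init(x)
        # Gradient scale from the LSQ paper keeps step-size updates well-conditioned.
        g = 1.0 / math.sqrt(x.numel() * self.qp)
        s = grad_scale(self.step, g)
        return round_ste(torch.clamp(x / s, self.qn, self.qp)) * s
```

In the setting of item 1 above, a quantizer like this would be attached to every layer of the SR network, including the feature-extraction and reconstruction convolutions, rather than only the non-linear mapping body.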

@blueardour (Member)

Closing for now; feel free to re-open if you have further questions.
