Official implementation of our ICLR 2025 paper, *Lossy Compression with Pretrained Diffusion Models*, by Jeremy Vonderfecht and Feng Liu. See our project page for an interactive demo of results.
We present a lossy compression method that leverages state-of-the-art diffusion models for entropy coding. Our method works zero-shot, requiring no additional training of the diffusion model or any ancillary networks. We apply the DiffC algorithm[^1] to Stable Diffusion 1.5, 2.1, XL, and Flux-dev, and demonstrate that our method is competitive with other state-of-the-art generative compression methods at ultra-low bitrates.
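The core idea behind DiffC-style coding is that each denoising step can be transmitted with relative entropy coding at a cost close to the KL divergence between the sender's true conditional and the receiver's model prediction. The following toy sketch (not the repository's implementation) computes this per-step cost in bits for diagonal Gaussians, illustrating why a better model prediction means a lower bitrate:

```python
import numpy as np

def gaussian_kl_bits(mu_p, sigma_p, mu_q, sigma_q):
    """KL(p || q) between diagonal Gaussians, in bits.

    In DiffC-style relative entropy coding, communicating one denoising
    step costs roughly this many bits.
    """
    kl_nats = (np.log(sigma_q / sigma_p)
               + (sigma_p**2 + (mu_p - mu_q)**2) / (2 * sigma_q**2)
               - 0.5)
    return float(np.sum(kl_nats) / np.log(2))

# Toy example: the closer the model's prediction q is to the true
# conditional p, the fewer bits the step costs.
mu_p = np.zeros(4)
close = gaussian_kl_bits(mu_p, np.ones(4), mu_p + 0.1, np.ones(4))
far = gaussian_kl_bits(mu_p, np.ones(4), mu_p + 1.0, np.ones(4))
print(close, far)  # close < far
```

This is why a stronger pretrained diffusion model translates directly into a lower bitrate: no retraining is needed, only a better match between the model's prediction and the true denoising distribution.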
We compare our method (DiffC) against PerCo, DiffEIC, HiFiC, and MS-ILLM.
In the following rate-distortion curves, SD1.5, SD2.1, SDXL, and Flux represent the DiffC algorithm with those respective diffusion models. The dashed horizontal 'VAE' lines represent the best achievable metrics given the fidelity of the model's variational autoencoder.
To install:

```shell
git clone https://github.com/JeremyIV/diffc.git
cd diffc
conda env create -f environment.yml
conda activate diffc
```

To evaluate on the Kodak dataset:

```shell
python evaluate.py --config configs/SD-1.5-base.yaml --image_dir data/kodak --output_dir results/SD-1.5-base/kodak
```
To save the compressed representation of an image as a diffc file, use:

```shell
python compress.py --config configs/SD-1.5-base.yaml --image_dir data/kodak --output_dir results/SD-1.5-base/kodak/compressed --recon_timestep 200
```
To reconstruct images from their compressed representations, use:

```shell
python decompress.py --config configs/SD-1.5-base.yaml --input_dir results/SD-1.5-base/kodak/compressed --output_dir results/SD-1.5-base/kodak/reconstructions
```
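To sanity-check the bitrate achieved by a compressed file, one can convert its size to bits per pixel. This is a generic snippet, not part of the repository; the default dimensions assume Kodak images (768×512), and the synthetic file is only for demonstration:

```python
import os
import tempfile

def bits_per_pixel(path, width=768, height=512):
    """Bits per pixel for a compressed file; Kodak images are 768x512."""
    return os.path.getsize(path) * 8 / (width * height)

# Demo on a synthetic file: 49152 bytes = 768*512/8 bits, i.e. exactly 1 bpp.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"\0" * 49152)
bpp = bits_per_pixel(f.name)
os.unlink(f.name)
print(f"{bpp:.2f} bpp")  # → 1.00 bpp
```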
Note that compress.py and decompress.py currently only work with SD-1.5-base.yaml. To make them work with the other configs, you would need to specify manual_dkl_per_step in the config file.
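As a purely illustrative sketch of what such a config entry might look like (the key name comes from the note above, but the value format and semantics are placeholders, not taken from the repository):

```yaml
# Hypothetical example only: the list format and values below are
# assumptions, not the repository's actual schema.
manual_dkl_per_step: [0.5, 0.5, 0.5]  # assumed: per-step KL budget
```

Consult the SD-1.5-base.yaml config shipped with the repository for the authoritative format before adapting other configs.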
```bibtex
@inproceedings{
  vonderfecht2025lossy,
  title={Lossy Compression with Pretrained Diffusion Models},
  author={Jeremy Vonderfecht and Feng Liu},
  booktitle={The Thirteenth International Conference on Learning Representations},
  year={2025},
  url={https://openreview.net/forum?id=raUnLe0Z04}
}
```
Thanks to https://github.com/danieleades/arithmetic-coding for the entropy coding library.
[^1]: Theis, L., Salimans, T., Hoffman, M. D., & Mentzer, F. (2022). Lossy compression with Gaussian diffusion. arXiv preprint arXiv:2206.08889.