EVODiff Logo

✨ EVODiff: Entropy-aware Variance Optimized Diffusion Inference

NeurIPS 2025 🔥

A novel entropy-aware inference framework that enhances the denoising generation capabilities of diffusion models

If you find this work useful, please give us a star 🌟.

Paper • OpenReview • arXiv • Quick Start • Citation

Shigui Li¹, Wei Chen¹, Delu Zeng²✉️



🔭 Overview

EVODiff is an efficient diffusion model inference framework grounded in entropy-aware information flow optimization. It systematically improves image quality and accelerates generation by optimizing conditional variance at each step, all without relying on reference trajectories.

Grounded in information-theoretic principles, we show that successful denoising in diffusion models fundamentally operates by reducing conditional entropy across reverse transitions.

EVODiff Information Flow
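As a toy illustration of the connection between variance and entropy (standard information theory, not this paper's algorithm): reverse transitions in diffusion sampling are Gaussian, and the differential entropy of a Gaussian depends only on its variance, so optimizing the conditional variance of each step directly controls the conditional entropy being reduced.

```python
import math

def gaussian_entropy(sigma2):
    """Differential entropy of N(mu, sigma^2): 0.5 * log(2*pi*e*sigma^2)."""
    return 0.5 * math.log(2 * math.pi * math.e * sigma2)

# Shrinking the conditional variance of a reverse transition lowers its
# conditional entropy -- the quantity an entropy-aware sampler can target.
low_var_entropy = gaussian_entropy(0.25)
high_var_entropy = gaussian_entropy(1.0)
assert low_var_entropy < high_var_entropy
```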

💡 Key Features

EVODiff is the first entropy-aware inference framework for diffusion models, improving generation by optimizing the denoising information flow.

  • 🛡️ Significant Theoretical Contribution: Provides the first rigorous mathematical proof that data-prediction parameterization is superior to noise-prediction for diffusion model inference, theoretically grounding previous empirical findings. 🔥
  • 📉 Entropy-aware Denoising: Directly optimizes reconstruction error by systematically leveraging entropy-aware information flow and variance reduction.
  • 🚀 Reference-free via On-the-fly Optimization: Achieves superior performance without relying on reference trajectories or costly optimization procedures (unlike methods that require optimization or distillation from $\tilde{x}_0$).
Strategies employed for optimizing reconstruction error

Comparison of strategies employed for optimizing reconstruction error across different methods. EVODiff uniquely leverages entropy-aware denoising information flow for better inference with training-free optimization.
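For context on the data- vs. noise-prediction claim above: under the common VP notation $x_t = \alpha_t x_0 + \sigma_t \epsilon$, the two parameterizations are related by an invertible linear map, so the comparison concerns numerical behavior during inference, not model capacity. A minimal sketch of the standard conversion (textbook diffusion algebra, not code from this repository):

```python
import numpy as np

# Standard VP-diffusion relation: x_t = alpha_t * x_0 + sigma_t * eps.
def eps_to_x0(x_t, eps_hat, alpha_t, sigma_t):
    """Convert a noise prediction into the equivalent data prediction."""
    return (x_t - sigma_t * eps_hat) / alpha_t

def x0_to_eps(x_t, x0_hat, alpha_t, sigma_t):
    """Convert a data prediction into the equivalent noise prediction."""
    return (x_t - alpha_t * x0_hat) / sigma_t

# The two views carry the same information: converting back and forth
# recovers the original prediction exactly.
x_t = np.array([0.3, -1.2, 0.7])
eps_hat = np.array([0.1, 0.5, -0.4])
x0_hat = eps_to_x0(x_t, eps_hat, alpha_t=0.8, sigma_t=0.6)
assert np.allclose(x0_to_eps(x_t, x0_hat, alpha_t=0.8, sigma_t=0.6), eps_hat)
```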

🔥 News

  • [2025.9.19] πŸŽ‰ EVODiff has been accepted by NeurIPS 2025!
  • [Coming Soon] 🚀 The official implementation code will be released soon!

🖼️ Better Generation Quality with Better Understanding

EVODiff significantly improves generation quality, especially at low numbers of function evaluations (NFEs), by effectively reducing uncertainty and mitigating visual artifacts.

Qualitative Results

Qualitative comparisons on text-to-image generation using the simple prompt "Giant caterpillar riding a bicycle". EVODiff leverages entropy-aware information flow to reduce artifacts and enhance fidelity compared to SOTA solvers.

📈 Efficient Generation

Extensive experiments demonstrate that EVODiff consistently outperforms SOTA gradient-based solvers in terms of both speed (NFE) and quality (FID).

Quantitative Results

Quantitative comparisons demonstrating EVODiff's consistent superior performance (lower FID) across diverse datasets (CIFAR-10 shown here) and varying NFEs.

🚀 Quick Start

Installation

# Clone the repository
git clone https://github.com/ShiguiLi/EVODiff.git
cd EVODiff
# Install dependencies
pip install -r requirements.txt

Usage

To use EVODiff, initialize the EVODiff_edm class to wrap the sampling process, or run the sampling script directly:

python sample.py \
  --ckp_path="path/to/checkpoint.pkl" \
  --sample_folder="my_output_folder" \
  --method="evodiff" \
  --steps=10 \
  --order=2 \
  --skip_type="logSNR" \
  --denoise_to_zero
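Until the official code is released, here is a hypothetical sketch of the wrapped-sampler usage pattern. Only the class name EVODiff_edm comes from this README; the constructor arguments, noise schedule, and stub denoiser are placeholder assumptions, and the update shown is a plain deterministic data-prediction Euler step, not EVODiff's entropy-aware update.

```python
import numpy as np

class EVODiff_edm:  # stand-in mirroring the class name from the README
    def __init__(self, denoiser, steps=10):
        self.denoiser = denoiser  # maps (x, sigma) -> data prediction x0_hat
        self.steps = steps

    def sample(self, x_T, sigma_max=80.0, sigma_min=0.002):
        # Geometric noise schedule from sigma_max down to sigma_min
        # (uniform in log-sigma, loosely analogous to a logSNR skip type).
        sigmas = np.geomspace(sigma_max, sigma_min, self.steps + 1)
        x = x_T
        for s_cur, s_next in zip(sigmas[:-1], sigmas[1:]):
            x0_hat = self.denoiser(x, s_cur)               # data prediction
            x = x0_hat + (s_next / s_cur) * (x - x0_hat)   # Euler ODE step
        return x

# Usage: wrap any denoiser; here a trivial stub that always predicts zeros.
sampler = EVODiff_edm(lambda x, s: np.zeros_like(x), steps=10)
out = sampler.sample(np.ones(4) * 80.0)
```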

πŸ“ Citation

If you find EVODiff useful for your research and applications, please cite our work:

🧩 Conference Version (NeurIPS 2025)

@inproceedings{li2025evodiff,
  title={{EVOD}iff: Entropy-aware Variance Optimized Diffusion Inference},
  author={Shigui Li and Wei Chen and Delu Zeng},
  booktitle={The Thirty-ninth Annual Conference on Neural Information Processing Systems},
  year={2025},
  url={https://openreview.net/forum?id=rKASv92Myl}
}

📘 Preprint Version (arXiv)

@article{li2025evodiff,
  title={{EVOD}iff: Entropy-aware Variance Optimized Diffusion Inference},
  author={Li, Shigui and Chen, Wei and Zeng, Delu},
  journal={arXiv preprint arXiv:2509.26096},
  year={2025}
}

About

[NeurIPS 2025 🔥] EVODiff is an inference-time refinement method for diffusion models that improves sampling efficiency and generative fidelity by systematically reducing conditional entropy, without relying on reference trajectories.
