Official PyTorch implementation of Denoising MCMC for Accelerating Diffusion-Based Generative Models, ICML 2023 Oral Paper.
We propose a general sampling framework, Denoising MCMC (DMCMC), that combines Markov chain Monte Carlo (MCMC) with reverse-SDE/ODE integrators of diffusion models to accelerate score-based sampling. MCMC is used to produce samples in the product space of data and diffusion time, and reverse-SDE/ODE integrators are used to denoise the MCMC samples. Since MCMC travels close to the data manifold, DMCMC can produce high-quality data at reduced computation cost. The figure below illustrates the general idea of DMCMC.
Our framework is compatible with any choice of MCMC, variance-exploding score model, and reverse-SDE/ODE integrator. In particular, combined with the integrators of Karras et al. (2022) and the pre-trained score models of Song et al. (2021b), DMCMC achieves state-of-the-art results in the limited number of score function evaluations (NFE) setting on CIFAR10.
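The two-stage pipeline above (MCMC in the product space of data and noise level, followed by reverse-ODE denoising) can be sketched on a toy Gaussian example. Everything here is illustrative: the analytic score, the Langevin-style product-space kernel, and the Euler probability-flow integrator are minimal stand-ins for the actual score networks, MCMC samplers, and integrators used in the paper.

```python
import numpy as np

def score(x, sigma):
    # Analytic VE score for data ~ N(0, I): the marginal at noise level
    # sigma is N(0, (1 + sigma^2) I), so the score is -x / (1 + sigma^2).
    return -x / (1.0 + sigma**2)

def mcmc_product_space(x, sigma, n_steps=200, step=1e-2, rng=None):
    # Toy Langevin-style walk in the product space of data x and noise
    # level sigma (the update rule is illustrative, not the paper's kernel).
    rng = rng if rng is not None else np.random.default_rng(0)
    samples = []
    for _ in range(n_steps):
        x = x + step * score(x, sigma) \
              + np.sqrt(2 * step) * rng.standard_normal(x.shape)
        # Random walk on the noise level, kept inside [0.5, 5.0].
        sigma = float(np.clip(sigma + 0.05 * rng.standard_normal(), 0.5, 5.0))
        samples.append((x.copy(), sigma))
    return samples

def denoise_ode(x, sigma, n_steps=50):
    # Euler integration of the VE probability-flow ODE
    # dx/dsigma = -sigma * score(x, sigma), from sigma down to ~0.
    sigmas = np.linspace(sigma, 1e-3, n_steps)
    for s_cur, s_next in zip(sigmas[:-1], sigmas[1:]):
        x = x + (s_next - s_cur) * (-s_cur * score(x, s_cur))
    return x

# Run the chain, then denoise the final MCMC sample.
x0 = np.zeros(4)
chain = mcmc_product_space(x0, sigma=2.0)
x_noisy, sigma_final = chain[-1]
x_clean = denoise_ode(x_noisy, sigma_final)
```

Because the MCMC chain stays at moderate noise levels near the data manifold, the denoising stage only has to integrate over a short noise range, which is the source of DMCMC's speedup.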
All main experiments in this paper were performed using the code in the `Demo.ipynb` file. To run this code:
- Install the required packages.
- Create the directory `./exp/ve` and place score model checkpoints in that directory. We used the pre-trained VE NCSN checkpoints provided by Song et al. in this repository. Our method is also compatible with the RVE NCSN models of Kim et al., provided in this repository.
- Our code is heavily based on the code of Song et al. in this repository.
- FID in our paper was measured using the code in this repository.
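The setup steps above amount to creating the checkpoint directory and copying a downloaded checkpoint into it; the checkpoint filename below is illustrative, use whichever checkpoint file you actually downloaded.

```shell
# Create the checkpoint directory expected by Demo.ipynb.
mkdir -p ./exp/ve

# Copy a pre-trained VE NCSN checkpoint into it (filename is an
# example only -- substitute the file you downloaded):
# cp ~/Downloads/checkpoint_24.pth ./exp/ve/

# Verify the directory exists and list its contents.
ls ./exp/ve
```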
If you find the code useful for your research, please consider citing:
```
@article{kim2022dmcmc,
  title={Denoising MCMC for Accelerating Diffusion-Based Generative Models},
  author={Beomsu Kim and Jong Chul Ye},
  journal={arXiv preprint arXiv:2209.14593},
  year={2022}
}
```