Accelerating equilibrium spin-glass simulations using quantum data and deep learning

Description

Sampling from the low-temperature Boltzmann distribution of spin glasses is a hard computational task, relevant for physics research and for important optimization problems in engineering and finance. Adiabatic quantum computers are being used to tackle the corresponding optimization task, namely finding the lowest-energy spin configuration. In this paper we show how to exploit quantum annealers to accelerate equilibrium Markov chain Monte Carlo simulations of spin glasses at low but finite temperature. Generative neural networks are trained on spin configurations produced by D-Wave quantum annealers and are then used to generate smart proposals for the Metropolis-Hastings algorithm. In particular, we explore hybrid schemes that combine neural and single-spin-flip proposals, as well as D-Wave and classical Monte Carlo training data. The hybrid algorithm outperforms the single-spin-flip Metropolis-Hastings algorithm and is competitive with parallel tempering in terms of correlation times, with the significant benefit of faster equilibration.

For a visual summary (with some results) you can have a look at the notebook article_figures without re-running anything. If you want to reproduce the plots from the article, download the data and install the dependencies before running it.
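
The core idea is the standard Metropolis-Hastings acceptance rule for independent proposals drawn from a generative model with a tractable likelihood. The NumPy sketch below is not the repository's implementation: it replaces the trained generative network with a hypothetical factorized Bernoulli proposal so that the example stays self-contained.

# Minimal sketch of a neural-proposal Metropolis-Hastings step.
# Assumption: the proposal is a factorized Bernoulli distribution over spins,
# standing in for the trained generative model used in the repository.
import numpy as np

rng = np.random.default_rng(0)

def ising_energy(spins, couplings):
    # E(s) = -1/2 s^T J s for spins in {-1, +1} and a symmetric coupling matrix J
    return -0.5 * spins @ couplings @ spins

def sample_proposal(probs):
    # Draw spins in {-1, +1}; probs[i] is the probability that spin i is +1
    return np.where(rng.random(probs.shape) < probs, 1.0, -1.0)

def log_q(spins, probs):
    # Log-likelihood of a configuration under the factorized proposal
    return np.sum(np.log(np.where(spins > 0, probs, 1.0 - probs)))

def neural_mh_step(spins, couplings, probs, beta):
    # One global update: propose a whole new configuration and accept it with
    # probability min(1, exp(-beta * dE) * q(old) / q(new))
    new = sample_proposal(probs)
    log_alpha = (-beta * (ising_energy(new, couplings) - ising_energy(spins, couplings))
                 + log_q(spins, probs) - log_q(new, probs))
    if np.log(rng.random()) < log_alpha:
        return new, True
    return spins, False

In the hybrid schemes mentioned above, such global neural updates are interleaved with ordinary single-spin-flip Metropolis sweeps.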

How to run

Install dependencies

# clone project
git clone https://github.com/gscriva/n-mcmc
cd n-mcmc

# [OPTIONAL] create conda environment
bash bash/setup_conda.sh

# install requirements
pip install -r requirements.txt

Get the data from the Zenodo record 10.5281/zenodo.7250436 and move it into data/; the directory must be organized as follows:

data
  ├── couplings
  |     ├── 100.txt
  |     .
  |     .
  |     └── 484-z8.txt
  ├── data_for_fig
  |     ├── data_fig1.csv
  |     .
  |     .
  |     .
  |     └── data_fig7.csv
  └── datasets
        ├── 100-1mus
        |    ├── train_1us.npy
        |    └── train_1us.npy
        .
        .
        .
        └── 484-z8-1mus
            ├── ...
            └── ...
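
As a quick check that the files are in place, they can be loaded with NumPy. This is only a sketch: the exact layout of the couplings text files (dense matrix vs. edge list) and of the dataset arrays should be verified against the Zenodo record.

# Sketch only: assumes numpy.loadtxt can parse the couplings files and that
# the .npy files hold arrays of spin configurations; check the actual formats.
import numpy as np

couplings = np.loadtxt("data/couplings/100.txt")
train = np.load("data/datasets/100-1mus/train_1us.npy")
print(couplings.shape, train.shape, train.dtype)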

Train a model with the default configuration

# default
python run.py

Train a model with a chosen experiment configuration from configs/experiment/

# model with 100 spins
python run.py experiment=100spin-1nn.yaml

# models with 484 spins
python run.py experiment=484spin-3nn.yaml
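
Since the project is built on a PyTorch Lightning + Hydra template, other settings can typically be overridden directly on the command line. The keys below (trainer.max_epochs, datamodule.batch_size) are assumptions based on the standard template layout; check configs/ for the names actually used here.

# Hypothetical overrides -- verify the key names against configs/ before using them
python run.py experiment=100spin-1nn.yaml trainer.max_epochs=50 datamodule.batch_size=256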

You can generate samples from a trained model with

python predict.py --ckpt-path=logs/the/trained/model/path.ckpt --model=made 
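
The output path and format of predict.py are not documented here, but assuming the generated samples end up as a NumPy array of ±1 configurations and the couplings load as a dense matrix, a quick sanity check is to look at the energy distribution of the samples:

# Sketch only: "samples.npy" is a placeholder for wherever predict.py writes
# its output; adapt the paths and check that spins are encoded as +/-1.
import numpy as np

couplings = np.loadtxt("data/couplings/100.txt")   # assumed dense N x N matrix
samples = np.load("samples.npy")                   # assumed shape (n_samples, n_spins)
energies = -0.5 * np.einsum("bi,ij,bj->b", samples, couplings, samples)
print(energies.mean(), energies.min())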

Citation

@article{10.21468/SciPostPhys.15.1.018,
	title={{Accelerating equilibrium spin-glass simulations using quantum annealers via generative deep learning}},
	author={Giuseppe Scriva and Emanuele Costa and Benjamin McNaughton and Sebastiano Pilati},
	journal={SciPost Phys.},
	volume={15},
	pages={018},
	year={2023},
	publisher={SciPost},
	doi={10.21468/SciPostPhys.15.1.018},
	url={https://scipost.org/10.21468/SciPostPhys.15.1.018},
}