This work is licensed under CC BY 4.0.
All scripts run on Python 3.7 or newer. The following packages are required:
- numpy
- scipy
- mosek (+ license)
- sympy, for the `multinomial_coefficients_iterator` function
- psutil, for the exemplary command line programs; only used to get the physical CPU count
The `Optimizer...py` files define a class `Optimizer` that is responsible for carrying out the optimization. The filename indicates the problem this corresponds to; this is detailed in the file header.
We additionally provide two command line applications:
- `Main.py` runs the optimizations for given parameters `d`, `s`, and `r`. Various configuration options are available; see the help. The output is stored in a subfolder named `<d> <s> <r>` according to the parameters; the files are named `<pdist> <f> <type>.dat`, where `<pdist>` is the distillation success probability and `<f>` the optimized fidelity. If `<type>` is `choi`, the file contains the vectorized upper triangle of the Choi state of the distillation map (in the Dicke basis). If `<type>` is `rho`, it contains the vectorized upper triangle of the input density matrix (in the Dicke basis). If the `--erasure` parameter is specified, the optimization is done in the full computational basis and `rho` is replaced by `psi`; the file then contains the full input state vector. Additionally, the subfolder is suffixed with `erasure`.
- `MainAllR.py` runs the optimizations for given parameters `d`, `s`, and `ptrans`. Various configuration options are available; see the help. This program uses the data created by `Main.py` (without the `--erasure` option) as initial points and therefore can only be run afterwards. It explores the possibility of using different maps for various `r` values. The output is stored in a subfolder named `<ptrans> <d> <s>`; the files are named `<ptot> <f> <type>.dat`, where `<ptot>` now is the total success probability.
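As a sketch of how the stored data might be read back, the snippet below rebuilds a Hermitian matrix from a vectorized upper triangle such as the one stored in the `choi` and `rho` files. The row-major packing order (upper triangle including the diagonal) is an assumption for illustration, not taken from the repository:

```python
import numpy as np

def unpack_upper_triangle(vec):
    """Rebuild a Hermitian matrix from its vectorized upper triangle.

    Assumes row-major ordering of the upper triangle including the
    diagonal; the actual .dat layout may differ.
    """
    # len(vec) = n(n+1)/2, so n = (sqrt(8*len + 1) - 1) / 2
    n = int((np.sqrt(8 * len(vec) + 1) - 1) / 2)
    mat = np.zeros((n, n), dtype=complex)
    mat[np.triu_indices(n)] = vec
    # Mirror onto the lower triangle; subtract the (real) diagonal
    # once so it is not counted twice.
    return mat + mat.conj().T - np.diag(np.diag(mat).real)
```

For example, a vector of length 6 yields a 3x3 Hermitian matrix.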
Both applications are parallelized and by default use either the environment variable `SLURM_NTASKS` (if the SLURM manager is used) or the number of physical cores available. This can be configured using the `--workers` option.
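The default worker selection just described can be sketched as follows (a simplified stand-in: the actual programs use psutil to count *physical* cores, whereas the stdlib `os.cpu_count()` used here counts logical ones):

```python
import os

def default_worker_count() -> int:
    """Pick a worker count the way the applications do by default:
    SLURM_NTASKS when running under SLURM, otherwise the CPU count."""
    slurm_tasks = os.environ.get("SLURM_NTASKS")
    if slurm_tasks is not None:
        return int(slurm_tasks)
    # Fallback; the real programs query psutil for physical cores.
    return os.cpu_count() or 1
```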
The data that underlies the paper can be generated by appropriate calls to the `Main.py` program. Note that for qubits and low dimensions, this can easily be done on a personal computer, but qudits and larger values of `s` and `r` require substantial memory and time resources. The actual commands may therefore depend on the scheduler.
For example, with SLURM, a shell script `Main.sh` may be created that looks as follows:

```sh
#!/bin/sh
# potentially set up environment
python Main.py $1 ${SLURM_ARRAY_TASK_ID} $2
```
Then, a call such as `sbatch --ntasks=<multithreading level> --array=2-10 --time=<time constraint> Main.sh 2 1` will queue jobs that create the folders for every configuration from `2 2 1` to `2 10 1` with the corresponding numerical data.
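To make the mapping between the `sbatch` call and the output folders explicit: the SLURM array index supplies `s`, while `d` and `r` come from the two script arguments. The expected folder names can be enumerated as:

```python
# d and r are the two positional arguments of Main.sh ("2" and "1");
# the SLURM array index 2..10 supplies s.
d, r = 2, 1
folders = [f"{d} {s} {r}" for s in range(2, 11)]
print(folders[0], "...", folders[-1])  # 2 2 1 ... 2 10 1
```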
Note that the main application will automatically determine the number of threads from the appropriate SLURM environment variable. If a different scheduler is used, this value must be passed in the optional parameter `--workers=<multithreading level>`.