Bilevel optimization experiments for the paper "Bilevel Optimization with a Lower-level Contraction: Optimal Sample Complexity without Warm-Start".
- Install the packages in requirements.txt. The package versions listed are the ones used for the results in the paper, although other versions may also work. We suggest using a GPU to speed up the computation. For the random search, install the experiment manager Guild.
- Run one of the following files (optionally modifying the arguments in each file):
  - DEQs.py for the experiments with equilibrium models.
  - meta_learning_parallel.py for the meta-learning experiments.
  - poisoning for the experiments on data poisoning adversarial attacks.
  - poisoning_random_search.py for the random search on the data poisoning experiments (uses Guild).
hypergrad.py contains the AID hypergradient approximation method (i.e., an approximation of the gradient of the bilevel objective), which relies on torch.optim.Optimizer instances to solve the associated linear system. The method is taken from hypertorch/hypergrad/hypergradients.py. See hypertorch for more details on hypergradient approximation methods and some quick examples of how to incorporate them into a project.
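To illustrate the AID approach, here is a minimal self-contained sketch in plain PyTorch. It is not the repo's actual API (function names like `aid_hypergrad` are illustrative): the inner problem is a simple quadratic whose gradient step is a contraction, the linear system is solved by fixed-point iteration instead of a torch.optim.Optimizer, and the resulting hypergradient can be checked against the closed-form answer.

```python
import torch

# Illustrative sketch of AID (approximate implicit differentiation).
# Inner problem: g(w, lam) = 0.5 * ||w - lam||^2, solved by the map
# Phi(w, lam) = w - eta * (w - lam), a contraction in w for 0 < eta < 2.
# Outer objective: f(w) = 0.5 * ||w||^2. For this toy problem the exact
# hypergradient at lam is lam itself, which makes the sketch easy to verify.

def fixed_point_map(w, lam, eta=0.5):
    # One gradient step on the inner objective.
    return w - eta * (w - lam)

def outer_loss(w):
    return 0.5 * (w ** 2).sum()

def aid_hypergrad(lam, n_inner=100, n_linsys=100):
    # 1) Iterate the contraction to approximate the inner fixed point w*.
    w = torch.zeros_like(lam)
    for _ in range(n_inner):
        w = fixed_point_map(w, lam)

    w = w.detach().requires_grad_(True)
    lam = lam.detach().requires_grad_(True)
    phi = fixed_point_map(w, lam)
    grad_f = torch.autograd.grad(outer_loss(w), w)[0]

    # 2) Solve v = grad_f + (dPhi/dw)^T v by fixed-point iteration
    #    (vector-Jacobian products via autograd).
    v = torch.zeros_like(w)
    for _ in range(n_linsys):
        vjp = torch.autograd.grad(phi, w, grad_outputs=v, retain_graph=True)[0]
        v = grad_f + vjp

    # 3) Hypergradient: (dPhi/dlam)^T v.
    return torch.autograd.grad(phi, lam, grad_outputs=v)[0]

lam = torch.tensor([1.0, -2.0])
print(aid_hypergrad(lam))  # close to lam itself for this quadratic example
```

In the repo, step 2 is where torch.optim.Optimizer instances come in: the linear system is solved by minimizing an equivalent quadratic with a standard optimizer rather than by the plain fixed-point iteration used above.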
The class TorchBiOptimizer in bilevel_optimizers.py allows implementing different bilevel optimization methods by varying its parameters. These parameters are specified for several bilevel methods in guild.yml.
data.py and utils.py contain the dataset loading functions and some utility functions, respectively.
Details on the experimental settings can be found in the paper. Note that the keywords upper-level and lower-level are replaced by outer and inner, respectively, in the code.
If you use this code, please cite our paper:

@article{grazzi2023bilevel,
  title={Bilevel Optimization with a Lower-level Contraction: Optimal Sample Complexity without Warm-Start},
  author={Grazzi, Riccardo and Pontil, Massimiliano and Salzo, Saverio},
  journal={Journal of Machine Learning Research},
  volume={24},
  number={167},
  pages={1--37},
  year={2023}
}