BioptExps

Bilevel optimization experiments for the paper "Bilevel Optimization with a Lower-level Contraction: Optimal Sample Complexity without Warm-Start" (Grazzi, Pontil, Salzo; JMLR 2023).

Getting Started

  1. Install the packages in requirements.txt. The listed versions are the ones used for the results in the paper, although other versions may also work. We suggest using a GPU to speed up the computation. For the random search, install the experiment manager Guild.
  2. Run one of the experiment scripts in the repository, optionally modifying the arguments in each file.

Additional Info

hypergrad.py contains the AID (approximate implicit differentiation) method for approximating the hypergradient, i.e. the gradient of the bilevel objective, which relies on torch.optim.Optimizer instances to solve the associated linear system. The method is adapted from hypertorch/hypergrad/hypergradients.py. See hypertorch for more details on hypergradient approximation methods and some quick examples of how to incorporate them into a project.
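
As an illustration, here is a minimal, self-contained PyTorch sketch of AID. It is not the code in hypergrad.py: the names aid_hypergrad, fp_map, and outer_loss are hypothetical, and it assumes fp_map(w, lam) is the lower-level contraction map with fixed point w*(lam), outer_loss(w, lam) is the upper-level objective, and lam has requires_grad=True. In the spirit of the repo's approach, the linear system is solved by driving a torch.optim.Optimizer with the system residual.

import torch

# Minimal AID sketch. Assumptions (not the repo's API): fp_map(w, lam) is the
# lower-level contraction map Phi with fixed point w*(lam), outer_loss(w, lam)
# is the upper-level objective f, and lam has requires_grad=True.
def aid_hypergrad(w, lam, fp_map, outer_loss, K=10, lr=0.5):
    """Approximate the hypergradient d/d lam of f(w*(lam), lam)."""
    w = w.detach().requires_grad_(True)
    phi = fp_map(w, lam)                 # one application of the map, Phi(w, lam)
    f = outer_loss(w, lam)
    grad_w_f, = torch.autograd.grad(f, w, retain_graph=True)

    # Solve the linear system (I - J^T) v = grad_w_f, with J = d Phi / d w, by
    # taking K optimizer steps using the residual as the "gradient". The
    # residual is the exact gradient of the associated quadratic when J is
    # symmetric, e.g. when Phi is a gradient-descent step on the inner loss.
    v = torch.zeros_like(w)              # leaf tensor; the optimizer reads v.grad
    opt = torch.optim.SGD([v], lr=lr)    # any torch.optim.Optimizer can be used
    for _ in range(K):
        jtv, = torch.autograd.grad(phi, w, grad_outputs=v, retain_graph=True)
        v.grad = v - jtv - grad_w_f      # residual of (I - J^T) v = grad_w_f
        opt.step()

    # Assemble the hypergradient: grad_lam f + (d Phi / d lam)^T v.
    jtv_lam, = torch.autograd.grad(phi, lam, grad_outputs=v, retain_graph=True)
    grad_lam_f, = torch.autograd.grad(f, lam, allow_unused=True)
    return jtv_lam if grad_lam_f is None else grad_lam_f + jtv_lam

Because the linear-system solver is just a torch.optim.Optimizer, SGD above can be swapped for, e.g., SGD with momentum; this flexibility is what the optimizer-based solve in hypergrad.py provides.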

The class TorchBiOptimizer in bilevel_optimizers.py allows implementing different bilevel optimization methods by varying its parameters. These parameters are specified for several bilevel methods in guild.yml.
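
To make the role of such parameters concrete, below is a hypothetical skeleton of a single outer iteration of a parameterized bilevel method. It is not TorchBiOptimizer's actual interface: bilevel_step, inner_map, and hypergrad_fn are invented names. Knobs like the number of inner steps and whether the inner variable is warm-started are the kind of parameters that distinguish bilevel methods; the cold-start regime is the one analyzed in the paper.

import torch

# Hypothetical skeleton of one outer iteration of a parameterized bilevel
# method; not TorchBiOptimizer's actual interface. inner_map(w, lam) is the
# lower-level contraction, hypergrad_fn(w, lam) a hypergradient approximation
# (e.g. the AID sketch above), and outer_opt a torch.optim.Optimizer on lam.
def bilevel_step(w, lam, inner_map, hypergrad_fn, outer_opt,
                 n_inner=10, warm_start=True):
    if not warm_start:
        w = torch.zeros_like(w)      # cold start: reset the inner variable
    with torch.no_grad():
        for _ in range(n_inner):     # approximately solve the lower level
            w = inner_map(w, lam)
    outer_opt.zero_grad()
    lam.grad = hypergrad_fn(w, lam)  # approximate gradient of the bilevel objective
    outer_opt.step()                 # one update of the outer variable
    return w                         # caller may reuse w at the next outer step

Returning w lets the caller either reuse it at the next outer step (warm start) or discard it; the paper shows that the warm start is not needed to achieve optimal sample complexity.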

data.py and utils.py contain the dataset loading functions and some utility functions, respectively.

Details on the experimental settings can be found in the paper. Note that the keywords upper-level and lower-level are replaced by outer and inner, respectively, in the code.

Cite Us

If you use this code, please cite our paper.

@article{grazzi2023bilevel,
  title={Bilevel Optimization with a Lower-level Contraction: Optimal Sample Complexity without Warm-Start},
  author={Grazzi, Riccardo and Pontil, Massimiliano and Salzo, Saverio},
  journal={Journal of Machine Learning Research},
  volume={24},
  number={167},
  pages={1--37},
  year={2023}
}
