Experimental codebase accompanying the paper *Second-Order Sensitivity Analysis for Bilevel Optimization* by Robert Dyro, Edward Schmerling, Nikos Arechiga, and Marco Pavone, published at the International Conference on Artificial Intelligence and Statistics (AISTATS), 2022.
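For background, the bilevel problems the paper studies take the following generic form, with the inner solution differentiated via the implicit function theorem (standard notation for context, not copied from the paper):

```latex
% Upper-level loss f, lower-level objective g:
\min_{\theta} \; f\!\left(\theta, z^\star(\theta)\right)
\quad \text{s.t.} \quad
z^\star(\theta) = \arg\min_{z} \; g(\theta, z)

% Differentiating the inner stationarity condition
% \nabla_z g(\theta, z^\star(\theta)) = 0 gives the first-order sensitivity:
\frac{\partial z^\star}{\partial \theta}
  = -\left(\nabla^2_{zz} g\right)^{-1} \nabla^2_{z\theta} g
```

The second-order sensitivity of the title corresponds, roughly, to differentiating this map once more; the `implicit` package below implements the resulting routines.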
We do not recommend using this code directly; we maintain a separate, user-oriented version of this work implemented in PyTorch and JAX.
You can find it here:
We highly recommend using those packages instead when interacting with this work.
If you find this work useful, please cite the publication using the following BibTeX entry:
```bibtex
@inproceedings{DyroSchmerlingEtAl2022,
  author    = {Dyro, R. and Schmerling, E. and Arechiga, N. and Pavone, M.},
  title     = {Second-Order Sensitivity Analysis for Bilevel Optimization},
  booktitle = {Int. Conf. on Artificial Intelligence and Statistics},
  year      = {2022},
}
```
- `exps` – contains the experiments used in the paper
  - `exps/svm` – Support Vector Machine hyperparameter tuning (see the sketch after this list)
  - `exps/optimal_control` – inverse optimal control with constraints
  - `exps/auto_tuning` – model auto-tuning experiments
  - `exps/shared_scripts` – contains various general-purpose experimental scripts shared by the experiments
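To make the SVM experiment concrete, here is a hypothetical sketch of hyperparameter tuning posed as a bilevel problem, written in plain JAX; the smoothed hinge loss, the inner solver, and all names here are illustrative assumptions, not this repository's API.

```python
# Hypothetical sketch (not this repo's API): SVM hyperparameter tuning as a
# bilevel problem. The inner problem fits a smoothed-hinge SVM for a given
# regularization weight exp(log_lam); the outer problem scores it on
# validation data.
import jax
import jax.numpy as jnp

def smoothed_hinge(m):
    # quadratically smoothed hinge loss, so the inner problem is twice differentiable
    return jnp.where(m >= 1.0, 0.0,
                     jnp.where(m <= 0.0, 0.5 - m, 0.5 * (1.0 - m) ** 2))

def inner_loss(w, log_lam, X, y):
    # lower level: g(theta, z) with theta = log_lam, z = w
    return jnp.mean(smoothed_hinge(y * (X @ w))) + 0.5 * jnp.exp(log_lam) * w @ w

def solve_inner(log_lam, X, y, iters=500, lr=0.1):
    # plain gradient descent on the strongly convex inner problem
    w = jnp.zeros(X.shape[1])
    step = jax.jit(lambda w: w - lr * jax.grad(inner_loss)(w, log_lam, X, y))
    for _ in range(iters):
        w = step(w)
    return w

def outer_loss(log_lam, X_tr, y_tr, X_va, y_va):
    # upper level: f(theta, z*(theta)) evaluated at the fitted weights
    w = solve_inner(log_lam, X_tr, y_tr)
    return jnp.mean(smoothed_hinge(y_va * (X_va @ w)))
```

The point of the sensitivity machinery is to differentiate `outer_loss` with respect to `log_lam` through the implicit-function-theorem formula above, rather than by unrolling the iterations of `solve_inner`.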
- `implicit` – contains the main computational package
  - `implicit/interface.py` – interface bindings that make `jax` behave like `torch`
  - `implicit/implicit.py` – sensitivity routines and optimization-function generation routines
  - `implicit/diff.py` – differentiation utilities
  - `implicit/opt.py` – optimization routines
  - `implicit/nn_tools.py` – custom tools for working with neural networks in JAX
  - `implicit/pca.py` – principal component analysis visualization routines
  - `implicit/inverse.py` – specialized matrix-free inverse methods (see the sketch after this list)
  - `implicit/utils.py` – utility functions
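As one example of the general idea behind matrix-free inverses, here is a minimal Hessian solve in plain JAX (a sketch of the technique under the assumption that the system matrix is an inner-problem Hessian; it is not this repository's API): conjugate gradients only needs Hessian-vector products, so the Hessian is never materialized.

```python
# Minimal sketch of a matrix-free Hessian solve in plain JAX (illustrative,
# not this repo's API): solve H x = b, where H is the Hessian of f at w,
# using conjugate gradients with Hessian-vector products.
import jax
import jax.numpy as jnp
from jax.scipy.sparse.linalg import cg

def hvp(f, w, v):
    # Hessian-vector product via forward-over-reverse differentiation;
    # never forms the Hessian matrix
    return jax.jvp(jax.grad(f), (w,), (v,))[1]

def solve_hessian_system(f, w, b):
    # jax's conjugate gradients accepts a matvec callable in place of a matrix
    x, _ = cg(lambda v: hvp(f, w, v), b)
    return x

# quick check on a quadratic, whose Hessian is the constant matrix A
A = jnp.array([[3.0, 1.0], [1.0, 2.0]])
f = lambda w: 0.5 * w @ A @ w
b = jnp.array([1.0, 1.0])
x = solve_hessian_system(f, jnp.zeros(2), b)  # x should equal solve(A, b)
```

Each CG iteration costs one extra gradient evaluation, which is what makes this kind of inverse "matrix-free" and practical for high-dimensional inner problems.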
- `unit_tests` – contains sanity checks and unit tests that verify the package works as expected