
A Julia package implementing adaptive proximal gradient methods for convex bilevel optimization


pylat/adaptive-bilevel-optimization


Adaptive Proximal Algorithms for Convex Simple Bilevel Optimization

This repository contains Julia code for the paper AdaBiM: An essentially adaptive proximal gradient method for convex simple bilevel optimization.

The problems that can be tackled are of the form

$$ \begin{aligned} \text{minimize} \quad & f^1(x) + g^1(x) \\ \text{subject to} \quad & x \in \arg\min_{w} f^2(w) + g^2(w) \end{aligned} $$

where $f^1,f^2$ are locally Lipschitz differentiable and $g^1,g^2$ are (possibly) nonsmooth, prox-friendly functions, i.e., functions whose proximal mapping can be evaluated efficiently.
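As a concrete instance of this template (purely illustrative; the names below are not part of the package API), one can take the upper-level cost to be the $\ell_1$ norm and the lower-level problem to be least squares, so that the bilevel problem selects a sparse solution among all least-squares minimizers:

```julia
using LinearAlgebra

# Illustrative problem data (not from the package):
#   upper level:  f¹(x) = 0,            g¹(x) = ‖x‖₁
#   lower level:  f²(w) = ½‖Aw − b‖²,   g²(w) = 0
A, b = randn(10, 20), randn(10)

f2(w)      = 0.5 * norm(A * w - b)^2
grad_f2(w) = A' * (A * w - b)          # f² is smooth: gradient available in closed form
g1(x)      = norm(x, 1)

# g¹ is nonsmooth but prox-friendly: prox of γ‖·‖₁ is
# componentwise soft-thresholding, available in closed form.
prox_g1(x, γ) = sign.(x) .* max.(abs.(x) .- γ, 0)
```

This closed-form gradient/prox structure is exactly what proximal gradient methods exploit.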

The algorithms are implemented in this repository.

You can download the datasets required in some of the experiments by running:

julia --project=. download_datasets.jl

Numerical simulations for several different problems are contained in dedicated subfolders; for example, the linear inverse problem with the $\ell_1$ norm as the upper-level cost function has its own subfolder. Each subfolder's runme.jl file contains the associated simulations; executing main() generates the plots in that subfolder.
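Concretely, running one of the experiments might look like the following sketch (it assumes, per the description above, that the subfolder's script defines a zero-argument main(); start Julia with the repository's project environment, e.g. julia --project=.):

```julia
# From within an experiment subfolder, with the project environment active:
include("runme.jl")  # load the simulation script for this problem
main()               # run the simulations; plots are written to this subfolder
```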
