| Research Topic: | Short-Horizon Gradient-Based Hyperparameter Optimization |
|---|---|
| Type of Work: | Research Project |
| Authors: | Eynullayev Altay, Rubtsov Denis, Karpeev Gleb |
Hyperparameter optimization is a fundamental challenge in modern machine learning: given a validation dataset, one must select suitable hyperparameters. Gradient-based methods address this via bilevel optimization, enabling search over spaces with millions or even billions of dimensions, far beyond the reach of classical approaches such as grid search or Bayesian optimization. This project implements and wraps key gradient-based HPO algorithms as a reusable JAX library: T1-T2 with the DARTS numerical approximation, Generalized Greedy Gradient-Based HPO, and Online HPO with Hypergradient Distillation. The library provides a unified API suitable for a broad class of tasks, with full documentation and automated testing.
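To illustrate the bilevel formulation these methods rely on (this is a minimal plain-JAX sketch, not the `gradhpo` API), one can unroll a short horizon of inner training steps and differentiate the resulting validation loss with respect to a hyperparameter; all data and function names below are illustrative:

```python
import jax
import jax.numpy as jnp

# Toy data: the inner problem is L2-regularized linear regression.
key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
X_train = jax.random.normal(k1, (32, 5))
y_train = X_train @ jnp.ones(5) + 0.1 * jax.random.normal(k2, (32,))
X_val = jax.random.normal(k3, (16, 5))
y_val = X_val @ jnp.ones(5)

def train_loss(w, log_lam):
    # Inner objective: MSE plus an L2 penalty controlled by the hyperparameter.
    return jnp.mean((X_train @ w - y_train) ** 2) + jnp.exp(log_lam) * jnp.sum(w ** 2)

def val_loss_after_training(log_lam, n_steps=20, lr=0.05):
    # Unroll a short horizon of inner gradient descent; the rollout is
    # differentiable, so jax.grad through it yields a short-horizon hypergradient.
    w = jnp.zeros(5)
    for _ in range(n_steps):
        w = w - lr * jax.grad(train_loss)(w, log_lam)
    # Outer objective: validation loss of the trained weights.
    return jnp.mean((X_val @ w - y_val) ** 2)

# Hypergradient of the validation loss w.r.t. the (log-)regularization strength.
hypergrad = jax.grad(val_loss_after_training)(jnp.array(0.0))
```

The algorithms in the library differ mainly in how they approximate this hypergradient (e.g. the DARTS finite-difference approximation instead of full unrolling), which avoids storing the entire inner trajectory.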
The paper can be found here.
A draft version can be found here.
The package is published on PyPI:

```
pip install gradhpo
```

Alternatively, install from a source checkout:

```
git clone https://github.com/intsystems/gradhpo.git
pip install ./gradhpo/src
```

- A Python package `gradhpo` published on PyPI; sources here.
- Code with all experiment visualisations here. Can be run in Colab.
- Documentation hosted at intsystems.github.io/gradhpo.