intsystems/gradhpo

Research Topic: Short-Horizon Gradient-Based Hyperparameter Optimization
Type of Work: Research Project
Authors: Eynullayev Altay, Rubtsov Denis, Karpeev Gleb

Abstract

Hyperparameter optimization is a fundamental challenge in modern machine learning: given a validation dataset, one must select suitable hyperparameters. Gradient-based methods address this via bilevel optimization, enabling optimization over billion-dimensional search spaces, far beyond the reach of classical approaches such as grid search or Bayesian optimization. This project implements and wraps key gradient-based HPO algorithms as a reusable JAX library: T1-T2 with the DARTS numerical approximation, Generalized Greedy Gradient-Based HPO, and Online HPO with Hypergradient Distillation. The library provides a unified API suitable for a broad class of tasks, with full documentation and automated testing.
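For intuition, here is a minimal, self-contained sketch of the short-horizon hypergradient idea behind methods like T1-T2: unroll a single inner training step and differentiate the validation loss through it. This is a toy quadratic example in plain Python; the losses, function names, and update rule are assumptions made for illustration and do not reflect the gradhpo API.

```python
# Toy sketch of one-step (short-horizon) gradient-based HPO.
# Hyperparameter: an L2-regularization strength `lam` (hypothetical
# example, not the gradhpo API).
#
# Train loss:      L_t(w, lam) = 0.5*(w - 1)^2 + 0.5*lam*w^2
# Validation loss: L_v(w)      = 0.5*(w - 2)^2

def train_grad(w, lam):
    """dL_t/dw for the toy train loss."""
    return (w - 1.0) + lam * w

def hpo_step(w, lam, lr_w=0.1, lr_lam=0.05):
    # Inner problem: one SGD step on the model weight (the "short horizon").
    w_new = w - lr_w * train_grad(w, lam)
    # Outer problem: differentiate the validation loss through that step.
    dval_dw = w_new - 2.0          # dL_v/dw'
    dw_dlam = -lr_w * w            # dw'/dlam = -lr_w * d^2 L_t/(dw dlam)
    hypergrad = dval_dw * dw_dlam  # chain rule: dL_v/dlam
    lam_new = max(0.0, lam - lr_lam * hypergrad)  # keep lam non-negative
    return w_new, lam_new

w, lam = 0.0, 1.0
for _ in range(500):
    w, lam = hpo_step(w, lam)
# Validation prefers larger w, so the hypergradient drives lam toward 0
# and w toward the unregularized train optimum w = 1.
```

In this toy case the mixed second derivative is available in closed form; the full methods (e.g. T1-T2 with the DARTS approximation) instead estimate such second-order terms numerically via finite differences, which is what makes them practical at scale.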

Library Planning

The library planning document can be found here.

Technical Report

A draft version can be found here.

Installation

The package is published on PyPI:

pip install gradhpo

Alternatively, install from a source checkout:

git clone https://github.com/intsystems/gradhpo.git
pip install ./gradhpo/src

Software modules developed as part of the study

  1. A Python package, gradhpo, published on PyPI; sources here.
  2. Code with all experiment visualisations here; it can also be run in Colab.
  3. Documentation hosted at intsystems.github.io/gradhpo.

About

A collection of short-horizon gradient-based hyperparameter optimization algorithms
