
# Meta-Optimization

This directory contains examples of using `learn2learn` for meta-optimization, also known as meta-descent.

## Hypergradient

The script `hypergrad_mnist.py` demonstrates how to implement a slightly modified version of "Online Learning Rate Adaptation with Hypergradient Descent". The implementation departs from the algorithm presented in the paper in two ways.

  1. We forgo the analytical formulation of the learning rate's gradient to demonstrate the capability of the `LearnableOptimizer` class.
  2. We adapt per-parameter learning rates instead of updating a single learning rate shared by all parameters.
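To make the two modifications above concrete, here is a minimal, self-contained sketch of hypergradient descent with per-parameter learning rates in plain PyTorch. It is not the `hypergrad_mnist.py` implementation (which relies on `learn2learn`'s `LearnableOptimizer`); the function name and the toy quadratic objective are illustrative assumptions. The update follows the paper's multiplicative hypergradient rule, applied elementwise so each parameter coordinate keeps its own rate.

```python
import torch

def hypergrad_sgd_step(params, grads, prev_grads, lrs, hyper_lr):
    """One SGD step where each parameter entry keeps its own learning rate,
    adapted by the elementwise hypergradient rule: lr += hyper_lr * g * g_prev.
    (The original paper adapts a single scalar rate via the dot product.)"""
    with torch.no_grad():
        for p, g, g_prev, lr in zip(params, grads, prev_grads, lrs):
            lr.add_(hyper_lr * g * g_prev)  # per-parameter rate adaptation
            p.sub_(lr * g)                  # plain SGD step with adapted rates

# Toy demo: minimize ||x||^2 with adapted per-coordinate rates.
torch.manual_seed(0)
x = torch.randn(5, requires_grad=True)
lrs = [torch.full_like(x, 0.01)]       # initial per-coordinate learning rates
prev = [torch.zeros_like(x)]           # zero "previous gradient" at step 0
for step in range(50):
    loss = (x ** 2).sum()
    grad, = torch.autograd.grad(loss, x)
    hypergrad_sgd_step([x], [grad], prev, lrs, hyper_lr=1e-3)
    prev = [grad]
print(loss.item())
```

Because the rates grow while successive gradients agree in sign and shrink when they disagree, the toy loss decreases faster than with the fixed initial rate.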

## Usage

!!! warning
    The parameters for this script were not carefully tuned.

Manually edit the script and run:

```
python examples/optimization/hypergrad_mnist.py
```