[ɒpˈtɪm ɪ tri] the practice of examining optimization algorithms, by means of suitable instruments or appliances
git clone https://github.com/optimetry/optimetry
cd optimetry
pip install -e .
Or, you know, just pluck from source.
# ...
from torch.optim import SGD
from your_research import CoolNewOptimizer
from optimetry import Graft
M = SGD(model.parameters(), lr=3e-4)
D = CoolNewOptimizer(model.parameters())
MxD = Graft(M, D) # graft M's norms onto D's directions
# ...
MxD.zero_grad()
loss.backward()
MxD.step()

@article{agarwal2020disentangling,
title={Disentangling Adaptive Gradient Methods from Learning Rates},
author={Agarwal, Naman and Anil, Rohan and Hazan, Elad and Koren, Tomer and Zhang, Cyril},
journal={arXiv preprint arXiv:2002.11803},
year={2020}
}
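The grafting idea from the paper above is simple to sketch: take the update *magnitude* from M and the update *direction* from D. The `graft_update` helper below is hypothetical, for illustration only, and assumes nothing about optimetry's internals; the real `Graft` class wraps two `torch.optim.Optimizer` instances rather than operating on raw update tensors.

```python
import torch

def graft_update(m_update: torch.Tensor, d_update: torch.Tensor,
                 eps: float = 1e-16) -> torch.Tensor:
    """Illustrative helper (not part of optimetry): rescale D's update
    to have the norm of M's update, keeping D's direction."""
    # eps guards against division by zero when D's update vanishes
    return d_update * (m_update.norm() / (d_update.norm() + eps))

# e.g. m_update from SGD, d_update from an adaptive method: the grafted
# step points where D points but moves as far as M would have moved.
```

In the library this rescaling is applied after each inner optimizer computes its step; see the source for how the updates are extracted per parameter.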
- Python >= 3.6
- torch >= 1.7.0
- TF 1.x code lives within Lingvo.