Official implementation for 'Gradient Aligned Regression via Pairwise Losses'.
Python version: 3.9.19
Prerequisite: torch==2.0.0
from loss import GAR
# define the loss function with the alpha hyper-parameter
criterion = GAR(alpha=0.2)
# ground truths: [bs, label_dim]
truths = ...
# predictions: [bs, label_dim]
preds = ...
# compute GAR loss
loss = criterion(preds, truths)
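For a quick sanity check, here is a minimal end-to-end sketch; the batch size, label dimension, and random tensors below are illustrative placeholders, not values from the paper:
import torch
from loss import GAR

criterion = GAR(alpha=0.2)
# illustrative shapes: batch size 32, 1-dimensional labels
preds = torch.randn(32, 1, requires_grad=True)   # stand-in for a model's predictions
truths = torch.randn(32, 1)                      # stand-in for ground-truth targets
loss = criterion(preds, truths)
loss.backward()                                  # gradients flow back to preds
print(loss.item())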
Example commands on the wine_quality dataset (a hypothetical launcher sketch for running both losses follows the commands):
python3 main.py --loss=GAR --dataset=wine_quality --lr=1e-2 --decay=1e-4
python3 main.py --loss=MAE --dataset=wine_quality --lr=1e-2 --decay=1e-4
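If you want to run both losses in one go, the following is a hypothetical launcher sketch; it only assumes the main.py flags already shown above:
import subprocess

# run main.py once per loss on wine_quality with the settings above
for loss_name in ["GAR", "MAE"]:
    subprocess.run([
        "python3", "main.py",
        f"--loss={loss_name}",
        "--dataset=wine_quality",
        "--lr=1e-2",
        "--decay=1e-4",
    ], check=True)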
Make sure you have the AgeDB data downloaded and pass its location to the code via '--data_folder'. A hypothetical alpha-sweep sketch is given after the two commands below.
- From scratch:
python3 ageDB_scratch.py --alpha=0.1 --learning_rate=0.5 --weight_decay=1e-4 --loss=GAR --data_folder='your-AgeDB-folder'
- Linear probe:
python3 ageDB_linear.py --alpha=0.1 --learning_rate=0.05 --weight_decay=1e-4 --loss=GAR --data_folder='your-AgeDB-folder' --ckpt='path-to-pretrained-model'
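To try several values of alpha for the from-scratch setting, a hypothetical sweep sketch is shown below; the alpha values are illustrative and the remaining flags match the from-scratch command above:
import subprocess

# sweep the alpha hyper-parameter for from-scratch AgeDB training (values are illustrative)
for alpha in [0.05, 0.1, 0.2]:
    subprocess.run([
        "python3", "ageDB_scratch.py",
        f"--alpha={alpha}",
        "--learning_rate=0.5",
        "--weight_decay=1e-4",
        "--loss=GAR",
        "--data_folder=your-AgeDB-folder",   # replace with your local AgeDB path
    ], check=True)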
We thank the prior work that provided the general experimental settings for AgeDB.
Please check synthetic.ipynb for how to run on the two synthetic (Sine and Squared Sine) datasets.
Please check GAR_analysis.ipynb for how to summarize the raw output results into the tables and figures reported in the manuscript.
If you find GAR useful in your work, please cite the following paper:
@misc{zhu2024gradientalignedregressionpairwise,
  title={Gradient Aligned Regression via Pairwise Losses},
  author={Dixian Zhu and Tianbao Yang and Livnat Jerby},
  year={2024},
  eprint={2402.06104},
  archivePrefix={arXiv},
  primaryClass={cs.LG},
  url={https://arxiv.org/abs/2402.06104},
}