Gradient based Hyperparameter Tuning library in PyTorch
Updated Jul 17, 2020 - Python
Learning Rate Warmup in PyTorch
Optimizer, LR scheduler, and loss function collections in PyTorch
Polynomial Learning Rate Decay Scheduler for PyTorch
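A polynomial decay schedule like the one above is just a closed-form curve from a base rate down to an end rate. A minimal dependency-free sketch (parameter names such as `base_lr`, `end_lr`, and `power` are illustrative defaults, not that repo's API):

```python
def polynomial_decay_lr(step, total_steps, base_lr=0.1, end_lr=0.0, power=2.0):
    """Polynomial decay from base_lr at step 0 to end_lr at total_steps.

    lr(t) = (base_lr - end_lr) * (1 - t / T) ** power + end_lr
    """
    step = min(step, total_steps)          # clamp so lr never rises again
    frac = 1.0 - step / total_steps        # remaining fraction of training
    return (base_lr - end_lr) * frac ** power + end_lr
```

With `power=1.0` this reduces to linear decay; in PyTorch the same curve can be wired into training via `torch.optim.lr_scheduler.LambdaLR`.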
A guide that integrates PyTorch DistributedDataParallel, Apex, warmup, and learning rate schedulers, and also covers setting up early stopping and random seeds.
Automatic learning-rate scheduler
PyTorch cyclic cosine decay learning rate scheduler
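Cyclic cosine decay anneals the rate along a half-cosine within each cycle and then restarts. A minimal sketch of the idea, assuming equal-length cycles (the `max_lr`/`min_lr` names are illustrative, not that repo's API):

```python
import math

def cyclic_cosine_lr(step, cycle_len, max_lr=0.1, min_lr=0.001):
    """Cosine anneal from max_lr to min_lr over each cycle, then restart."""
    t = step % cycle_len  # position within the current cycle
    return min_lr + 0.5 * (max_lr - min_lr) * (1 + math.cos(math.pi * t / cycle_len))
```

This is the schedule family behind `torch.optim.lr_scheduler.CosineAnnealingWarmRestarts`, which additionally supports growing cycle lengths.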
Warmup learning rate wrapper for PyTorch schedulers
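A warmup wrapper like the entries above composes with any base schedule: it scales the scheduled rate up linearly for the first few steps, then gets out of the way. A minimal sketch, assuming the base schedule is a plain `step -> lr` function (not that repo's actual interface):

```python
def with_linear_warmup(schedule, warmup_steps):
    """Wrap a step->lr schedule with linear warmup from near zero."""
    def wrapped(step):
        lr = schedule(step)
        if step < warmup_steps:
            lr *= (step + 1) / warmup_steps  # ramp factor in (0, 1]
        return lr
    return wrapped

# Usage: warm up a constant 0.1 schedule over 10 steps.
warm = with_linear_warmup(lambda s: 0.1, warmup_steps=10)
```

In PyTorch proper, the same composition is usually done with `torch.optim.lr_scheduler.SequentialLR` or a `LambdaLR` multiplier.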
sharpDARTS: Faster and More Accurate Differentiable Architecture Search
Keras callback to automatically adjust the learning rate when it stops improving
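The "reduce when improvement stalls" logic that Keras ships as `ReduceLROnPlateau` can be sketched framework-free: track the best metric seen, and multiply the rate by a factor once `patience` epochs pass without a new best. Parameter names mirror the Keras callback but the code below is an illustrative re-implementation, not the Keras source:

```python
def reduce_on_plateau(losses, init_lr=0.01, factor=0.5, patience=3, min_lr=1e-6):
    """Return the lr used at each epoch, halving it after `patience`
    consecutive epochs with no improvement in the monitored loss."""
    lr, best, wait, history = init_lr, float("inf"), 0, []
    for loss in losses:
        if loss < best:
            best, wait = loss, 0          # new best: reset patience counter
        else:
            wait += 1
            if wait >= patience:
                lr = max(lr * factor, min_lr)  # reduce, but never below min_lr
                wait = 0
        history.append(lr)
    return history
```

The real callback is configured as `tf.keras.callbacks.ReduceLROnPlateau(monitor="val_loss", factor=0.5, patience=3)`.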
A lightweight but efficient Transformer model for accurate univariate stock price forecasting, designed for real-time trading applications. This project transforms the vanilla Transformer architecture for higher-precision financial time series analysis with minimal computational demands.
End-to-end image classification using a deep learning toolkit for custom image datasets. Features include pre-processing, training with multiple CNN architectures, and statistical inference tools. Utilities for RAM optimization and learning rate scheduling, plus detailed code comments, are included.
A learning rate recommending and benchmarking tool.
PyTorch implementation of arbitrary learning rate and momentum schedules, including the One Cycle Policy
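The One Cycle Policy ramps the learning rate from a low starting value up to a peak over the first fraction of training, then anneals it back down below the start. A minimal sketch with cosine annealing on the way down (the `pct_start`, `start_lr`, and `final_lr` names are illustrative, not that repo's API):

```python
import math

def one_cycle_lr(step, total_steps, max_lr=0.1, pct_start=0.3,
                 start_lr=0.004, final_lr=0.0001):
    """One-cycle schedule: linear ramp to max_lr, cosine anneal to final_lr."""
    up = max(1, int(total_steps * pct_start))  # length of the warm-up phase
    if step <= up:
        return start_lr + (max_lr - start_lr) * step / up
    frac = (step - up) / (total_steps - up)    # progress through the anneal
    return final_lr + (max_lr - final_lr) * 0.5 * (1 + math.cos(math.pi * frac))
```

The full policy also anneals momentum inversely to the learning rate; PyTorch's built-in `torch.optim.lr_scheduler.OneCycleLR` handles both.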
Implementation of fluctuation dissipation relations for automatic learning rate annealing.
Master's thesis: Experiments on multistage step size schedulers for first-order optimization in minimax problems
Used Transformer-based and LSTM-based models for forecasting rainfall in different areas of Mumbai. Employed smart training techniques to improve correlation with the true time series.
MATLAB code for first-order optimization algorithms applied to elastic-net-regularized convex objective functions.