optimizer & lr scheduler & loss function collections in PyTorch
[PENDING] A lightweight but efficient Transformer model for accurate univariate stock price forecasting, designed for real-time trading applications. This project adapts the vanilla Transformer architecture for higher-precision financial time-series analysis with minimal computational demands.
(GECCO2023 Best Paper Nomination) CMA-ES with Learning Rate Adaptation
This project conducts a thorough analysis of weather time series data using diverse statistical and deep learning models. Each model was rigorously applied to the same weather time series data to assess and compare their forecasting accuracy. Detailed results and analyses are provided to delineate the strengths and weaknesses of each approach.
Learning Rate Warmup in PyTorch
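A minimal sketch of what learning-rate warmup looks like in PyTorch (not this repo's actual code): the learning rate ramps up linearly over the first few steps via `torch.optim.lr_scheduler.LambdaLR`. The model, base learning rate, and `warmup_steps` value are illustrative.

```python
import torch

# Toy model and optimizer; the base lr (0.1) is the warmup target.
model = torch.nn.Linear(10, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

# Linear warmup: scale the base lr by (step + 1) / warmup_steps,
# capped at 1.0 once warmup is complete.
warmup_steps = 5
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer,
    lr_lambda=lambda step: min(1.0, (step + 1) / warmup_steps),
)

lrs = []
for _ in range(8):
    optimizer.step()
    lrs.append(optimizer.param_groups[0]["lr"])
    scheduler.step()
# lrs ramps 0.02, 0.04, 0.06, 0.08, then holds at 0.1
```

After warmup the scale factor stays at 1.0, so the warmup scheduler can be chained with a decay schedule (e.g. via `SequentialLR`) for the rest of training.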
Used different Transformer-based and LSTM-based models for forecasting rainfall in different areas of Mumbai. Employed various training techniques to improve correlation with the true time series.
SPECTRA: Solar Panel Evaluation through Computer Vision and Advanced Techniques for Reliable Analysis
TVLARS - A Fast Convergence Optimizer for Large Batch Training
End-to-end Image Classification using Deep Learning toolkit for custom image datasets. Features include Pre-Processing, Training with Multiple CNN Architectures and Statistical Inference Tools. Special utilities for RAM optimization, Learning Rate Scheduling, and Detailed Code Comments are included.
The goal of this project is to devise an accurate CNN-based classifier able to distinguish between cats and dogs in images where the animal is predominant.
A method for assigning separate learning rate schedulers to different parameter groups in a model.
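One built-in way to get per-group schedules in stock PyTorch (a sketch, not necessarily this repo's approach) is to pass `LambdaLR` a list of lambdas, one per parameter group. The two-layer "backbone/head" split below is illustrative.

```python
import torch

# Two modules whose parameters go into separate optimizer groups.
backbone = torch.nn.Linear(10, 10)
head = torch.nn.Linear(10, 2)

optimizer = torch.optim.SGD(
    [
        {"params": backbone.parameters(), "lr": 0.01},
        {"params": head.parameters(), "lr": 0.1},
    ]
)

# LambdaLR accepts one lr_lambda per parameter group, so each group
# follows its own schedule from a single scheduler object.
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer,
    lr_lambda=[
        lambda epoch: 1.0,           # backbone: constant lr
        lambda epoch: 0.5 ** epoch,  # head: halve every epoch
    ],
)

for _ in range(3):
    optimizer.step()
    scheduler.step()

lrs = [g["lr"] for g in optimizer.param_groups]
# backbone stays at 0.01; head has been halved three times to 0.0125
```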
A learning rate recommending and benchmarking tool.
Flexible parameter scheduler that works with both proprietary and open-source optimizers.
Code to reproduce the experiments of ICLR2023-paper: How I Learned to Stop Worrying and Love Retraining
Semester project on the impact of label noise on deep learning optimization
Master's thesis: Experiments on multistage step size schedulers for first-order optimization in minimax problems
Build from scratch
A guide that integrates PyTorch DistributedDataParallel, Apex, warmup, and learning rate scheduling, and also covers setting up early stopping and random seeds.
In this repository, I put my newly acquired deep learning skills to the test on Kaggle's famous image classification problem, "Dogs vs. Cats".