optimizer & lr scheduler & loss function collections in PyTorch
Nadir: Cutting-edge PyTorch optimizers for simplicity & composability! 🔥🚀💻
A literature survey of convex optimizers and optimization methods for deep learning, made especially for optimization researchers with ❤️
Benchmarking Optimizers for Sign Language detection
PyTorch implementation of the Lookahead optimizer (https://arxiv.org/pdf/1907.08610.pdf) and RAdam (https://arxiv.org/pdf/1908.03265.pdf)
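The Lookahead idea is compact enough to sketch: an inner optimizer updates "fast" weights, and every k steps a set of "slow" weights is interpolated toward them and copied back. Below is a minimal illustrative re-implementation (written from the paper, not taken from this repository) wrapped around torch.optim.RAdam, which ships with PyTorch ≥ 1.10:

```python
import torch

class Lookahead:
    """Minimal Lookahead sketch: slow weights chase the fast weights every k steps."""
    def __init__(self, inner, k=5, alpha=0.5):
        self.inner, self.k, self.alpha = inner, k, alpha
        self.step_count = 0
        # One slow copy per parameter, initialized to the current weights.
        self.slow = [
            [p.detach().clone() for p in group["params"]]
            for group in inner.param_groups
        ]

    def zero_grad(self):
        self.inner.zero_grad()

    @torch.no_grad()
    def step(self):
        self.inner.step()  # fast update (RAdam here)
        self.step_count += 1
        if self.step_count % self.k == 0:
            for group, slow_group in zip(self.inner.param_groups, self.slow):
                for p, slow in zip(group["params"], slow_group):
                    slow += self.alpha * (p - slow)  # interpolate slow -> fast
                    p.copy_(slow)                    # reset fast weights to slow

model = torch.nn.Linear(10, 1)
optimizer = Lookahead(torch.optim.RAdam(model.parameters(), lr=1e-3))
```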
YOLOv2 implemented in tf.keras
A collection of deep learning models (PyTorch implementation)
RAdam implemented in Keras & TensorFlow
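On the TensorFlow/Keras side, a rectified Adam is also available in the TensorFlow Addons package (tfa.optimizers.RectifiedAdam). A hedged usage sketch, assuming that package rather than this repository's own implementation:

```python
import tensorflow as tf
import tensorflow_addons as tfa  # assumes tensorflow-addons is installed

# Compile any Keras model with RectifiedAdam as a drop-in optimizer.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(10,))])
model.compile(
    optimizer=tfa.optimizers.RectifiedAdam(learning_rate=1e-3),
    loss="mse",
)
```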
Classify well-known sites from around the world on a challenging, very large dataset. This project is based on a Kaggle competition.
Multi-label, multi-class classification model based on tf.keras
Object detection and instance segmentation with Mask R-CNN using torchvision, albumentations, TensorBoard, and the COCO API. Supports custom COCO datasets with positive/negative samples.
Quasi-hyperbolic, rectified DEMON Adam/AMSGrad with AdaMod, gradient centralization, Lookahead, iterative averaging, and decoupled weight decay
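One of the techniques named above, gradient centralization, is easy to illustrate: before the optimizer step, each weight gradient has its mean over all but the output dimension subtracted. A minimal sketch written from the technique's description, not from this repository:

```python
import torch

@torch.no_grad()
def centralize_gradients(model):
    """Subtract each multi-dim gradient's mean over all dims except dim 0."""
    for p in model.parameters():
        if p.grad is not None and p.grad.dim() > 1:
            dims = tuple(range(1, p.grad.dim()))
            p.grad -= p.grad.mean(dim=dims, keepdim=True)

model = torch.nn.Linear(10, 1)
optimizer = torch.optim.RAdam(model.parameters(), lr=1e-3)

loss = model(torch.randn(4, 10)).pow(2).mean()
loss.backward()
centralize_gradients(model)  # applied between backward() and step()
optimizer.step()
```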
Python code, notebooks, and images used for the AI502 midterm project.
MXNet implementation of RAdam optimizer
RAdam from "On the Variance of the Adaptive Learning Rate and Beyond", implemented in TensorFlow
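The paper's core idea fits in a few lines: RAdam tracks the length of the approximated simple moving average, ρ_t, and applies a rectification factor r_t to the adaptive step only once ρ_t > 4; earlier steps fall back to an un-adapted momentum update. A framework-agnostic sketch of those formulas, written from the paper rather than from this repository:

```python
import math

def rectification_term(t, beta2=0.999):
    """Return RAdam's r_t at step t, or None while the adaptive step is off."""
    rho_inf = 2.0 / (1.0 - beta2) - 1.0
    rho_t = rho_inf - 2.0 * t * beta2**t / (1.0 - beta2**t)
    if rho_t <= 4.0:
        return None  # variance intractable: use SGD-with-momentum style update
    return math.sqrt(
        (rho_t - 4.0) * (rho_t - 2.0) * rho_inf
        / ((rho_inf - 4.0) * (rho_inf - 2.0) * rho_t)
    )
```

For the default beta2 = 0.999, early steps return None (the warmup-like phase), and r_t rises toward 1 as ρ_t approaches ρ_∞.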
Ranger - a synergistic optimizer using RAdam (Rectified Adam) and Lookahead in one codebase
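The same RAdam-plus-Lookahead combination is also bundled in the third-party pytorch_optimizer package; assuming that package (not this repository's own code), usage is a one-liner:

```python
import torch
from pytorch_optimizer import Ranger  # assumes pytorch_optimizer is installed

model = torch.nn.Linear(10, 1)
optimizer = Ranger(model.parameters(), lr=1e-3)  # RAdam + Lookahead combined
```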