Implementing neural networks from scratch (MLP and CNN), purely in NumPy, with optimizers
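A from-scratch MLP of this kind boils down to a few NumPy matrix products plus a hand-written backward pass; a minimal sketch (layer sizes, names, and the single-layer update below are illustrative, not taken from the repository):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical one-hidden-layer MLP; all shapes are illustrative.
W1 = rng.normal(0, 0.1, (4, 8))   # input dim 4 -> hidden dim 8
b1 = np.zeros(8)
W2 = rng.normal(0, 0.1, (8, 3))   # hidden dim 8 -> output dim 3
b2 = np.zeros(3)

def forward(x):
    h = np.maximum(0, x @ W1 + b1)                  # ReLU hidden layer
    logits = h @ W2 + b2
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return h, e / e.sum(axis=1, keepdims=True)      # softmax probabilities

x = rng.normal(size=(5, 4))        # batch of 5 samples
y = np.array([0, 1, 2, 1, 0])      # integer class labels
h, p = forward(x)

# Cross-entropy gradient w.r.t. logits is (p - one_hot(y)) / batch_size.
grad_logits = p.copy()
grad_logits[np.arange(5), y] -= 1
grad_logits /= 5
grad_W2 = h.T @ grad_logits        # backprop into the output weights
W2 -= 0.1 * grad_W2                # one plain-SGD step
```

The optimizers such repositories ship (SGD, momentum, Adam, ...) then differ only in how `grad_W2` is turned into a parameter update.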
Testing out Schedule vs Schedule Free Optimizer
Code to conduct experiments for the paper Modified Gauss-Newton method for solving a smooth system of nonlinear equations.
Learn DSPy framework by coding text adventure game
An electrical grid simulator to calculate the least grid cost using optimizers from nevergrad package.
Zeroth-order optimizers, gradient chaining, and random gradient approximation
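The random-gradient-approximation idea can be sketched in a few lines: perturb the parameters randomly, difference the two function values, and use that as a surrogate gradient — an SPSA-style two-point estimator. The function and step size below are illustrative:

```python
import numpy as np

def spsa_grad(f, x, eps=1e-3, rng=None):
    """Two-point random gradient estimate: uses only function values,
    no analytic gradient -- a zeroth-order method."""
    if rng is None:
        rng = np.random.default_rng(0)
    delta = rng.choice([-1.0, 1.0], size=x.shape)   # random +/-1 perturbation
    # With +/-1 entries, 1/delta_i == delta_i, so multiplying by delta
    # gives the per-coordinate SPSA estimate.
    return (f(x + eps * delta) - f(x - eps * delta)) / (2 * eps) * delta

# Sanity check on a quadratic, where the true gradient is 2*x.
f = lambda x: float(np.sum(x ** 2))
x = np.array([1.0, -2.0])
g = spsa_grad(f, x)
```

A single estimate is noisy, but it is unbiased in expectation, so averaging a few samples (or just taking small steps) makes it usable as a drop-in gradient.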
Implementation of Sophia (Second-Order cliPped stocHastic optimizAtion)
Code to conduct experiments for the paper Regularization and acceleration of Gauss-Newton method.
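The Gauss-Newton iteration both of these papers build on approximates the Hessian by JᵀJ and solves a linear system per step; a minimal sketch on a toy smooth system (the system itself is made up for illustration):

```python
import numpy as np

def gauss_newton_step(residual, jac, x):
    """One Gauss-Newton step for min ||r(x)||^2: solve (J^T J) d = -J^T r."""
    r, J = residual(x), jac(x)
    d = np.linalg.solve(J.T @ J, -J.T @ r)
    return x + d

# Toy smooth system r(x) = 0 with root (1, 2); purely illustrative.
residual = lambda x: np.array([x[0] ** 2 - 1.0, x[1] - 2.0])
jac = lambda x: np.array([[2.0 * x[0], 0.0], [0.0, 1.0]])

x = np.array([3.0, 0.0])
for _ in range(8):
    x = gauss_newton_step(residual, jac, x)
```

The modifications studied in the papers (regularization, acceleration) change how this linear system is damped or how the step is combined with previous iterates, not the basic structure above.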
Neural networks framework built from scratch on Python.
Up-to-date fork for visualizing the loss landscape of neural nets
A bridge from callbacks to iterators. Acronym: bluesky callback iterator bridge. Motivated by the fact that bluesky wants to iterate over plans, while solvers typically use callbacks.
🧑🏫 50! Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, ...), gans(cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
Python Library for creating and training CNNs. Implemented from scratch.
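The core operation such a from-scratch CNN library implements is 2-D convolution (cross-correlation); a naive loop-based sketch (the function name and filter are illustrative):

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2-D cross-correlation over a single-channel image."""
    kh, kw = kernel.shape
    oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Dot product of the kernel with the window at (i, j).
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

img = np.arange(16, dtype=float).reshape(4, 4)
edge = np.array([[1.0, -1.0]])     # horizontal difference filter
out = conv2d(img, edge)
```

Real libraries vectorize this (e.g. via im2col) rather than looping, but the arithmetic is the same.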
Implementation and comparison of SGD, SGD with momentum, RMSProp and AMSGrad optimizers on the Image classification task using MNIST dataset
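These update rules differ only in how the raw gradient is transformed before the parameter step; a sketch of plain SGD versus SGD with momentum on a 1-D quadratic (hyperparameters are illustrative):

```python
import numpy as np

def sgd_step(x, grad, lr=0.1):
    return x - lr * grad                      # plain SGD

def momentum_step(x, grad, v, lr=0.1, beta=0.9):
    v = beta * v + grad                       # exponentially accumulated gradient
    return x - lr * v, v

# Minimize f(x) = x^2 (gradient 2x) from the same starting point.
x_sgd = np.array([5.0])
x_mom = np.array([5.0])
v = np.zeros(1)
for _ in range(100):
    x_sgd = sgd_step(x_sgd, 2 * x_sgd)
    x_mom, v = momentum_step(x_mom, 2 * x_mom, v)
```

On this well-conditioned 1-D problem momentum overshoots and oscillates before settling; its payoff shows up on ill-conditioned problems, which is exactly what such a comparison on MNIST probes. RMSProp and AMSGrad additionally rescale the step per coordinate by a running average of squared gradients.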
JAX implementation of 'Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training'
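As described in the paper, Sophia preconditions a gradient EMA with an EMA estimate of the Hessian diagonal and clips the resulting step elementwise. A hedged NumPy sketch of that update rule on a toy quadratic — the hyperparameters and the Hutchinson/finite-difference diagonal estimator here are illustrative choices, not the paper's exact recipe:

```python
import numpy as np

rng = np.random.default_rng(0)
A = np.diag([1.0, 10.0])                  # quadratic f(x) = 0.5 * x^T A x
grad = lambda x: A @ x

x = np.array([5.0, 5.0])
m = np.zeros(2)                           # gradient EMA (momentum)
h = np.zeros(2)                           # Hessian-diagonal EMA
beta1, beta2, lr, gamma, eps = 0.9, 0.99, 0.05, 0.05, 1e-12

for t in range(300):
    g = grad(x)
    m = beta1 * m + (1 - beta1) * g
    # Hutchinson estimator of the Hessian diagonal: E[u * (H u)],
    # with the Hessian-vector product taken by finite differences.
    u = rng.choice([-1.0, 1.0], size=2)
    hvp = (grad(x + 1e-3 * u) - g) / 1e-3
    h = beta2 * h + (1 - beta2) * u * hvp
    # Preconditioned step, clipped elementwise to [-1, 1] as in Sophia.
    x = x - lr * np.clip(m / np.maximum(gamma * h, eps), -1.0, 1.0)
```

The elementwise clip is what keeps the update bounded when the curvature estimate `h` is small or stale; the JAX version expresses the same state updates as a pytree transformation.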
Effect of Optimizer Selection and Hyperparameter Tuning on Training Efficiency and LLM Performance
This is an application for showing how optimization algorithms work
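At its core, such a demo animates the path an optimizer traces over a 2-D loss surface; a minimal sketch of generating that path with plain gradient descent (the function and step size are illustrative):

```python
import numpy as np

# Gradient of an elongated bowl f(x, y) = x^2 + 10*y^2, the classic
# ill-conditioned surface used to show optimizer behavior.
def grad(p):
    return np.array([2 * p[0], 20 * p[1]])

path = [np.array([4.0, 1.0])]
for _ in range(40):
    path.append(path[-1] - 0.05 * grad(path[-1]))
```

Plotting `path` shows the characteristic behavior: the steep `y` direction collapses almost immediately while the shallow `x` direction creeps in slowly — the picture that motivates momentum and adaptive methods.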
A Windows application I developed to optimize both the system and certain games
dm-haiku implementation of hyperbolic neural networks
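Hyperbolic networks replace ordinary vector addition with the Möbius addition of the Poincaré ball; a NumPy sketch of that operation for curvature c = 1 (a dm-haiku version would wrap this in a module, and library implementations add numerical safeguards near the boundary):

```python
import numpy as np

def mobius_add(x, y):
    """Mobius addition on the Poincare ball (curvature c = 1): the
    hyperbolic analogue of x + y used inside hyperbolic layers."""
    xy = np.dot(x, y)
    x2, y2 = np.dot(x, x), np.dot(y, y)
    num = (1 + 2 * xy + y2) * x + (1 - x2) * y
    return num / (1 + 2 * xy + x2 * y2)

a = np.array([0.3, 0.1])
b = np.array([-0.2, 0.4])
```

It satisfies the identities one expects of an "addition" — zero is the identity and `-a` is the inverse of `a` — while keeping results inside the unit ball.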