Implementation of neural networks from scratch (MLP and CNN), purely in NumPy, with optimizers
Updated Apr 27, 2022 - Python
Code to conduct experiments for the paper 'Modified Gauss-Newton method for solving a smooth system of nonlinear equations'.
An electrical grid simulator that finds the least grid cost using optimizers from the nevergrad package.
Lion - EvoLved Sign Momentum w/ New Optimizer API in TensorFlow 2.11+
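The Lion update rule introduced in 'Symbolic Discovery of Optimization Algorithms' is simple enough to sketch directly in NumPy; this is a rough illustration of the published pseudocode, not the repository's API, and the function and parameter names here are my own:

```python
import numpy as np

def lion_update(w, g, m, lr=1e-2, beta1=0.9, beta2=0.99, wd=0.0):
    """One Lion step: move along the sign of an interpolated momentum."""
    update = np.sign(beta1 * m + (1 - beta1) * g)  # interpolate momentum and gradient, take the sign
    w = w - lr * (update + wd * w)                 # fixed-magnitude step plus decoupled weight decay
    m = beta2 * m + (1 - beta2) * g                # momentum tracks the raw gradient
    return w, m

# minimize f(w) = w**2 (gradient 2w): the iterate walks toward 0 in fixed-size steps
w, m = 1.0, 0.0
for _ in range(300):
    w, m = lion_update(w, 2 * w, m)
```

Because the update is a sign, every coordinate moves by the same magnitude `lr`, which is why Lion typically uses a smaller learning rate (and larger weight decay) than Adam.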
Implementation of Sophia (Second-Order cliPped stocHastic optimizAtion)
Implementation and comparison of SGD, SGD with momentum, RMSProp, and AMSGrad optimizers on an image classification task using the MNIST dataset
Learn the DSPy framework by coding a text adventure game
🧑🏫 50! Implementations/tutorials of deep learning papers with side-by-side notes 📝; including transformers (original, xl, switch, feedback, vit, ...), optimizers (adam, adabelief, ...), gans(cyclegan, stylegan2, ...), 🎮 reinforcement learning (ppo, dqn), capsnet, distillation, ... 🧠
This is an application for showing how optimization algorithms work
Python Library for creating and training CNNs. Implemented from scratch.
Zeroth-order optimizers, gradient chaining, and random gradient approximation
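Zeroth-order methods like those above estimate gradients from function evaluations alone. A minimal sketch of one common approach, a random-direction central-difference estimator (my own naming and parameters, not the repository's API):

```python
import numpy as np

def random_grad_estimate(f, w, n_samples=4096, sigma=1e-3, seed=0):
    """Estimate grad f(w) from function values only, via central
    differences along random Gaussian directions."""
    rng = np.random.default_rng(seed)
    u = rng.standard_normal((n_samples, w.size))  # one random direction per sample
    # directional-derivative estimate along each direction
    fd = np.array([(f(w + sigma * ui) - f(w - sigma * ui)) / (2 * sigma) for ui in u])
    # E[(u . grad) u] = grad for standard normal u, so averaging recovers the gradient
    return fd @ u / n_samples

f = lambda w: np.sum(w ** 2)                        # true gradient is 2w
g = random_grad_estimate(f, np.array([1.0, 2.0]))   # approximately [2., 4.]
```

The estimate is unbiased for smooth `f` as `sigma -> 0`, but its variance grows with dimension, which is why zeroth-order optimizers typically average many samples per step.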
Code to conduct experiments for the paper 'Regularization and acceleration of Gauss-Newton method'.
dm-haiku implementation of hyperbolic neural networks
Neural network framework built from scratch in Python.
Up-to-date fork for visualizing the loss landscape of neural nets
A bridge from callback to iterator (acronym: bluesky callback iterator bridge). Motivated by the fact that bluesky wants to iterate over plans, while solvers typically use callbacks.
JAX implementation of 'Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training'
A collection of optimizers, some arcane others well known, for Flax.
Optim4RL is a JAX framework for learning to optimize in reinforcement learning.