# backward-propagation

Here are 48 public repositories matching this topic...

I built the Micrograd autograd engine: a small but functioning neural network library with a forward pass, backward propagation, and stochastic gradient descent, all built from scratch. It is derived from @karpathy's excellent micrograd lecture. Each notebook contains Andrej's lecture code and narration alongside my own code, anecdotes, and additions; a minimal sketch of such an engine follows this entry.

  • Updated Jul 21, 2024
  • Jupyter Notebook
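
As a rough illustration of what such a scratch-built engine does, here is a minimal scalar autograd sketch in the spirit of micrograd: each `Value` node records its inputs during the forward pass, `backward()` walks the graph in reverse topological order applying the chain rule, and a plain gradient-descent step updates the parameters. The class, operator set, and learning rate below are illustrative assumptions, not the repository's actual code.

```python
import math

class Value:
    """Scalar value with automatic differentiation (micrograd-style sketch)."""
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def tanh(self):
        t = math.tanh(self.data)
        out = Value(t, (self,))
        def _backward():
            self.grad += (1 - t * t) * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # Topologically sort the graph, then apply the chain rule in reverse.
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

# Tiny usage example: one neuron, one stochastic gradient descent step.
w, b, x = Value(0.5), Value(0.1), Value(2.0)
y = (w * x + b).tanh()
y.backward()
for p in (w, b):
    p.data -= 0.1 * p.grad  # hypothetical learning rate of 0.1
```

Only `+`, `*`, and `tanh` are shown; a full engine would add the remaining operators in the same pattern.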

This notebook demonstrates a neural network implemented with NumPy alone, without TensorFlow or PyTorch. Trained on the MNIST dataset, it uses an architecture with an input layer (784 neurons), two hidden layers (132 and 40 neurons), and an output layer (10 neurons), with sigmoid activations; a minimal sketch of the forward and backward passes follows this entry.

  • Updated Mar 15, 2024
  • Jupyter Notebook
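
To make the described architecture concrete, here is a hedged NumPy sketch of the forward pass and backpropagation for a 784-132-40-10 network with sigmoid activations. The squared-error loss, learning rate, weight initialization, and the random stand-in input (in place of an actual MNIST image) are assumptions for illustration; the notebook's own training loop may differ.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Layer sizes as described: 784 -> 132 -> 40 -> 10.
sizes = [784, 132, 40, 10]
rng = np.random.default_rng(0)
weights = [rng.normal(0, 0.1, (m, n)) for n, m in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros((m, 1)) for m in sizes[1:]]

def forward(x):
    """Forward pass; returns the activations of every layer, input included."""
    activations = [x]
    a = x
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)
        activations.append(a)
    return activations

def backward(x, y, lr=0.1):
    """Backpropagate a squared-error loss, then take one gradient step."""
    activations = forward(x)
    # Output-layer error: dL/dz = (a - y) * sigmoid'(z), with sigmoid' = a(1 - a).
    delta = (activations[-1] - y) * activations[-1] * (1 - activations[-1])
    for layer in range(len(weights) - 1, -1, -1):
        grad_W = delta @ activations[layer].T
        grad_b = delta
        if layer > 0:
            # Propagate the error to the previous layer before updating weights.
            a = activations[layer]
            delta = (weights[layer].T @ delta) * a * (1 - a)
        weights[layer] -= lr * grad_W
        biases[layer] -= lr * grad_b

# Hypothetical single training example (stand-in for an MNIST image/label pair).
x = rng.random((784, 1))
y = np.zeros((10, 1)); y[3] = 1.0
backward(x, y)
```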
