Gradient Checking: Demonstrates 1D and ND gradient checking techniques to verify the accuracy of gradients in neural networks. Inspired by DeepLearning.AI's Deep Learning Specialization.
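For reference, a minimal sketch of the technique this describes: comparing an analytic gradient against central finite differences and reporting a relative error. The quadratic objective `f` and its gradient are illustrative placeholders, not the repository's code.

```python
# Gradient checking sketch: numerical vs. analytic gradients (assumed example).
import numpy as np

def f(theta):
    return np.sum(theta ** 2)              # example objective

def analytic_grad(theta):
    return 2 * theta                        # hand-derived gradient to verify

def grad_check(theta, eps=1e-7):
    num = np.zeros_like(theta)
    for i in range(theta.size):
        plus, minus = theta.copy(), theta.copy()
        plus.flat[i] += eps
        minus.flat[i] -= eps
        num.flat[i] = (f(plus) - f(minus)) / (2 * eps)   # central difference
    ana = analytic_grad(theta)
    # relative error; values around 1e-7 are typically considered a pass
    return np.linalg.norm(ana - num) / (np.linalg.norm(ana) + np.linalg.norm(num))

print(grad_check(np.random.randn(3, 4)))    # works for 1D and ND arrays alike
```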
I built the Micrograd autograd engine: a functioning neural network library with a forward pass, backpropagation, and stochastic gradient descent, all built from scratch. It is derived from the great @karpathy micrograd lecture. Each notebook contains Andrej's lecture code and commentary, as well as my own code, anecdotes, and additions.
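In that spirit, a heavily trimmed sketch of a micrograd-style scalar `Value` supporting a forward pass and reverse-mode backpropagation; it follows the lecture's general design but is an assumption, not the notebook's actual code.

```python
# Minimal micrograd-style autograd sketch (assumed, not the notebook's code).
class Value:
    def __init__(self, data, _children=()):
        self.data = data
        self.grad = 0.0
        self._backward = lambda: None
        self._prev = set(_children)

    def __add__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data + other.data, (self, other))
        def _backward():
            self.grad += out.grad                 # d(a+b)/da = 1
            other.grad += out.grad
        out._backward = _backward
        return out

    def __mul__(self, other):
        other = other if isinstance(other, Value) else Value(other)
        out = Value(self.data * other.data, (self, other))
        def _backward():
            self.grad += other.data * out.grad    # d(a*b)/da = b
            other.grad += self.data * out.grad
        out._backward = _backward
        return out

    def backward(self):
        # topological order ensures each node's grad is complete before use
        topo, visited = [], set()
        def build(v):
            if v not in visited:
                visited.add(v)
                for child in v._prev:
                    build(child)
                topo.append(v)
        build(self)
        self.grad = 1.0
        for v in reversed(topo):
            v._backward()

a, b = Value(2.0), Value(-3.0)
loss = a * b + a
loss.backward()
print(a.grad, b.grad)   # -2.0, 2.0
```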
A gentle introduction to custom gradient propagation for ML applications in which the parameters of LTI systems have to be optimized. This example enables the integration of control theory with machine learning for the development of Physics-Informed Neural Networks (PINNs).
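As a rough illustration of the idea (not the repository's implementation), the sketch below propagates gradients by hand through a scalar discrete-time LTI system x[k+1] = a·x[k] + b·u[k] and verifies them with a finite difference; the system, loss, and target trajectory are assumed placeholders.

```python
# Custom gradient propagation through an LTI rollout (assumed toy example).
import numpy as np

def simulate(a, b, u, x0=0.0):
    # roll out x[k+1] = a*x[k] + b*u[k]
    xs = [x0]
    for uk in u:
        xs.append(a * xs[-1] + b * uk)
    return np.array(xs)

def loss_and_grads(a, b, u, target):
    xs = simulate(a, b, u)
    err = xs[1:] - target                   # fit the trajectory to a target
    L = 0.5 * np.sum(err ** 2)
    da = db = dx = 0.0                      # dx: gradient flowing into x[k+1]
    for k in reversed(range(len(u))):       # backprop through time
        g = err[k] + dx                     # dL/dx[k+1]
        da += g * xs[k]                     # d x[k+1]/da = x[k]
        db += g * u[k]                      # d x[k+1]/db = u[k]
        dx = g * a                          # d x[k+1]/dx[k] = a
    return L, da, db

u = np.ones(50)
target = simulate(0.9, 0.5, u)[1:]          # synthetic data from "true" (a, b)
a, b = 0.8, 0.4
L, da, db = loss_and_grads(a, b, u, target)

# sanity-check the custom gradient with a central finite difference
eps = 1e-6
Lp, _, _ = loss_and_grads(a + eps, b, u, target)
Lm, _, _ = loss_and_grads(a - eps, b, u, target)
print(da, (Lp - Lm) / (2 * eps))            # the two should agree closely
```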
This code uses a computational graph and a neural network to solve a five-layer traffic demand estimation problem on the Sioux Falls network. It also includes a comparison of models and ten cross-validation runs.
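For the cross-validation part, a generic NumPy sketch of k-fold splitting with k = 10; the `evaluate` callback and toy scoring below are placeholders, not the Sioux Falls pipeline.

```python
# Generic 10-fold cross-validation over index splits (assumed sketch).
import numpy as np

def k_fold_scores(n_samples, evaluate, k=10, seed=0):
    idx = np.random.default_rng(seed).permutation(n_samples)
    folds = np.array_split(idx, k)
    scores = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        scores.append(evaluate(train, test))   # train/score one model per fold
    return np.array(scores)

# toy usage: the "score" is just the fraction of held-out indices that are even
scores = k_fold_scores(100, lambda tr, te: np.mean(te % 2 == 0))
print(scores.mean(), scores.std())
```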
This notebook demonstrates a neural network implementation in pure NumPy, without TensorFlow or PyTorch. Trained on the MNIST dataset, it features an architecture with an input layer (784 neurons), two hidden layers (132 and 40 neurons), and an output layer (10 neurons), using sigmoid activation.
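A compact sketch of the described 784-132-40-10 forward pass with sigmoid activations; the random weights and input below are placeholders for the notebook's trained parameters.

```python
# Forward pass for a 784-132-40-10 sigmoid network (assumed initialisation).
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

rng = np.random.default_rng(0)
sizes = [784, 132, 40, 10]
weights = [rng.standard_normal((m, n)) * np.sqrt(1.0 / n)
           for n, m in zip(sizes[:-1], sizes[1:])]
biases = [np.zeros((m, 1)) for m in sizes[1:]]

def forward(x):
    a = x                                   # x: (784, 1) column vector
    for W, b in zip(weights, biases):
        a = sigmoid(W @ a + b)              # sigmoid at every layer
    return a                                # (10, 1) class scores

x = rng.random((784, 1))                    # stand-in for one MNIST image
print(forward(x).ravel())
```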
Digit Recognition Neural Network: built from scratch using only NumPy. An optimised version includes HOG feature extraction; a third version utilises prebuilt ML libraries.
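As an illustration of the HOG step (using scikit-image rather than the repository's own code), a possible extraction for 28×28 digit images; the cell and block sizes are assumptions, not necessarily those used in the optimised version.

```python
# HOG feature extraction for a 28x28 digit image (assumed parameters).
import numpy as np
from skimage.feature import hog

image = np.random.rand(28, 28)              # stand-in for one MNIST digit
features = hog(image,
               orientations=9,              # gradient-direction bins
               pixels_per_cell=(7, 7),      # 4x4 grid of cells per digit
               cells_per_block=(2, 2))      # local contrast normalisation
print(features.shape)                       # HOG vector fed to the classifier
```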