A minimal neural network implementation using only NumPy, developed as a learning exercise to understand the fundamentals of deep learning from scratch.

abardakci/neural-network-from-scratch


My neural network implementation using only NumPy, featuring dense layers with ReLU activation, Softmax with categorical cross-entropy loss, forward propagation, and backpropagation.
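The building blocks named above can be sketched as follows. This is a minimal illustration in NumPy, not the repository's actual code; class and attribute names here are my own assumptions.

```python
import numpy as np

class Dense:
    """Fully connected layer: y = x @ W + b."""
    def __init__(self, n_in, n_out):
        # Small random weights; the repo's initialization scheme may differ.
        self.W = np.random.randn(n_in, n_out) * 0.01
        self.b = np.zeros(n_out)

    def forward(self, x):
        self.x = x  # cache the input for the backward pass
        return x @ self.W + self.b

    def backward(self, dout):
        # Gradients w.r.t. parameters (stored) and w.r.t. the input (returned)
        self.dW = self.x.T @ dout
        self.db = dout.sum(axis=0)
        return dout @ self.W.T

class ReLU:
    def forward(self, x):
        self.mask = x > 0  # remember which units were active
        return x * self.mask

    def backward(self, dout):
        # Gradient passes through only where the input was positive
        return dout * self.mask
```

Caching the forward-pass input in each layer is what makes the backward pass possible without recomputation.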

This is a toy project, so many standard training and inference conveniences are missing, but it correctly implements both forward propagation and backpropagation.
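The softmax output with categorical cross-entropy loss is the part where backpropagation simplifies nicely: the combined gradient with respect to the logits is just `probs - targets`. A hedged sketch (function names are illustrative, not the repo's API):

```python
import numpy as np

def softmax(z):
    # Subtract the row max for numerical stability before exponentiating
    e = np.exp(z - z.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, y_onehot, eps=1e-12):
    # Mean negative log-likelihood over the batch
    return -np.mean(np.sum(y_onehot * np.log(probs + eps), axis=1))

def softmax_ce_backward(probs, y_onehot):
    # Gradient of softmax + cross-entropy w.r.t. the logits,
    # averaged over the batch: (p - y) / N
    return (probs - y_onehot) / probs.shape[0]
```

Fusing the two operations avoids computing the full softmax Jacobian, which is why most from-scratch implementations treat "softmax with cross-entropy" as a single output layer.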

Tested on the MNIST dataset, the network achieves over 90% test accuracy with two dense layers and a softmax output layer.
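Test accuracy on a classification task like MNIST is just the fraction of samples whose highest-scoring class matches the label. A self-contained sketch (the helper name is mine, not the repo's):

```python
import numpy as np

def accuracy(logits, labels):
    """Fraction of correct predictions.

    logits: (batch, n_classes) raw scores or probabilities
    labels: (batch,) integer class ids
    """
    # argmax picks the predicted class per row; compare to the true labels
    return np.mean(logits.argmax(axis=1) == labels)
```

Since argmax is unaffected by the softmax (a monotonic transform per row), accuracy can be computed on raw logits directly.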

To try the project, open a terminal and run: python run.py

This project uses the MNIST dataset, which is publicly available from Yann LeCun's website.
