This is the code for the "How to Make a Neural Net" live session by @Sirajology on YouTube


# Make a Neural Net Live Demo

## Overview

This is an implementation of a two-layer neural network, built during the live demo by @Sirajology on YouTube. The training method is stochastic (online) gradient descent with momentum. The network learns to compute XOR for the given inputs. It uses a different activation function for each layer: tanh for the hidden layer and sigmoid for the output layer, with cross-entropy as its loss function. This is all done in less than 100 lines of code. We're building this thing from scratch!
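To make the pieces named above concrete, here is a minimal sketch of such a network: a tanh hidden layer, a sigmoid output, cross-entropy loss, and per-sample (stochastic) gradient descent with momentum, trained on XOR. This is *not* the repository's `demo.py` — the hidden-layer size, hyperparameters, and function name here are illustrative assumptions.

```python
import numpy as np

def train_xor(epochs=5000, lr=0.1, mu=0.9, hidden=4, seed=0):
    """Train a two-layer net on XOR with SGD + momentum.

    Returns the final predictions for the four XOR inputs and the
    per-epoch average cross-entropy loss. All hyperparameter defaults
    are illustrative, not taken from the repo.
    """
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    y = np.array([0.0, 1.0, 1.0, 0.0])

    # Parameters: 2 inputs -> `hidden` tanh units -> 1 sigmoid output
    W1 = rng.normal(scale=0.5, size=(2, hidden)); b1 = np.zeros(hidden)
    W2 = rng.normal(scale=0.5, size=(hidden,));   b2 = 0.0
    vel = [np.zeros_like(W1), np.zeros_like(b1), np.zeros_like(W2), 0.0]

    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    losses = []
    for _ in range(epochs):
        total = 0.0
        for i in rng.permutation(4):       # stochastic (online) updates
            x, t = X[i], y[i]
            h = np.tanh(x @ W1 + b1)       # hidden layer: tanh
            p = sigmoid(h @ W2 + b2)       # output layer: sigmoid
            total += -(t * np.log(p) + (1 - t) * np.log(1 - p))
            # Sigmoid output + cross-entropy loss: output delta is just p - t
            d_out = p - t
            d_hid = d_out * W2 * (1.0 - h ** 2)   # backprop through tanh
            grads = [np.outer(x, d_hid), d_hid, d_out * h, d_out]
            # Momentum: velocity is a decaying running sum of gradients
            for j, (v, g) in enumerate(zip(vel, grads)):
                vel[j] = mu * v - lr * g
            W1 += vel[0]; b1 += vel[1]; W2 += vel[2]; b2 += vel[3]
        losses.append(total / 4)

    preds = sigmoid(np.tanh(X @ W1 + b1) @ W2 + b2)
    return preds, losses
```

One reason the sigmoid/cross-entropy pairing shows up so often: their derivatives cancel so that the output-layer error signal reduces to `p - t`, which keeps the backward pass short.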

## Dependencies

None!

## Usage

Just run the following in a terminal to see it run:

```
python demo.py
```

## Credits

The credits for the majority of this code go to lightcaster. I've merely created a wrapper to get people started.
