issikaaymeric/Neural-Network

🧠 Simple Neural Network with NumPy

This project demonstrates how to build basic feedforward neural networks from scratch using only NumPy, with no machine learning libraries such as TensorFlow or PyTorch involved.

It includes two parts:

  1. A 2-layer neural network (input → hidden → output)
  2. A 1-layer neural network (input → output)

These examples are intended for educational use, to help you understand how neural networks work under the hood.


📦 Requirements

Only one dependency is required:

numpy

Install with:

```
pip install numpy
```

๐Ÿ“ File Overview

import numpy as np
  • ๐Ÿ”น Part 1: Two-Layer Neural Network

```python
X = np.array([[0, 0, 1],
              [1, 1, 1],
              [1, 0, 1],
              [0, 1, 1]])
y = np.array([[0, 1, 1, 0]]).T
```
  • Input: 4 samples, each with 3 binary features
  • Output: XOR-like pattern

The network architecture:

  • Input layer: 3 neurons
  • Hidden layer: 4 neurons
  • Output layer: 1 neuron
  • Activation Function: Sigmoid

The training loop runs for 6,000 iterations of basic forward and backward propagation.
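The full code is in the repository script; below is a minimal sketch of what such a two-layer training loop looks like. The variable names (`W0`, `W1`, `l1`, `l2`), the random seed, and the learning rate of 1 are illustrative assumptions, not necessarily the script's exact choices.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Dataset from Part 1
X = np.array([[0, 0, 1],
              [1, 1, 1],
              [1, 0, 1],
              [0, 1, 1]])
y = np.array([[0, 1, 1, 0]]).T

np.random.seed(1)                      # reproducible manual initialization (assumed seed)
W0 = 2 * np.random.random((3, 4)) - 1  # input -> hidden weights, in [-1, 1)
W1 = 2 * np.random.random((4, 1)) - 1  # hidden -> output weights

for _ in range(6000):
    # Forward propagation through both layers
    l1 = sigmoid(X @ W0)               # hidden activations, shape (4, 4)
    l2 = sigmoid(l1 @ W1)              # predictions, shape (4, 1)

    # Backpropagation: error deltas scaled by the sigmoid derivative s * (1 - s)
    l2_delta = (y - l2) * l2 * (1 - l2)
    l1_delta = (l2_delta @ W1.T) * l1 * (1 - l1)

    # Weight updates (full-batch gradient step)
    W1 += l1.T @ l2_delta
    W0 += X.T @ l1_delta

print(l2)  # should approach [[0], [1], [1], [0]]
```

Note that the hidden-layer delta is the output delta propagated back through `W1`, then scaled by the hidden layer's own sigmoid slope; this is the chain rule written out by hand.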


🔹 Part 2: One-Layer Neural Network

```python
X = np.array([[0, 0, 1],
              [1, 1, 1],
              [1, 0, 1],
              [0, 1, 1]])
y = np.array([[0, 0, 1, 1]]).T
```

This version simplifies the network to just:

  • Input layer: 3 neurons
  • Output layer: 1 neuron
  • Activation Function: Sigmoid

The training loop runs for 10,000 iterations.
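A minimal sketch of the one-layer loop follows. One caveat: a single sigmoid layer can only fit linearly separable targets, and with the exact row ordering printed above the labels form an XOR of the first two inputs, which one layer cannot learn. The sketch therefore assumes the row ordering of the classic single-layer tutorial, where the label simply equals the first input feature; the seed and variable names are likewise illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Rows ordered so that y equals the first input column (linearly separable)
X = np.array([[0, 0, 1],
              [0, 1, 1],
              [1, 0, 1],
              [1, 1, 1]])
y = np.array([[0, 0, 1, 1]]).T

np.random.seed(1)
W = 2 * np.random.random((3, 1)) - 1  # single weight matrix: input -> output

for _ in range(10000):
    output = sigmoid(X @ W)                       # forward pass
    delta = (y - output) * output * (1 - output)  # error scaled by sigmoid slope
    W += X.T @ delta                              # weight update

print(output)  # should approach [[0], [0], [1], [1]]
```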


🧮 Concepts Covered

  • Forward propagation
  • Sigmoid activation function and its derivative
  • Backpropagation (gradient descent)
  • Weight updates using error deltas
  • Manual weight initialization
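The sigmoid derivative deserves a note: since σ'(x) = σ(x)·(1 − σ(x)), the slope can be computed directly from the activation already produced during the forward pass, with no extra call to `exp`. A tiny illustration (function names are my own):

```python
import numpy as np

def sigmoid(x):
    """Squashes any real input into the interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(s):
    # Takes the *output* of sigmoid, not the raw input:
    # if s = sigmoid(x), then d(sigmoid)/dx = s * (1 - s)
    return s * (1 - s)

s = sigmoid(0.0)
print(s, sigmoid_derivative(s))  # 0.5 0.25
```

This is why the deltas in both training loops multiply the error by `output * (1 - output)` rather than recomputing the exponential.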

📤 Output

At the end of training, the model prints the output predictions for the dataset:

```
Output after training
[[0.00966449]
 [0.00786506]
 [0.99358911]
 [0.99337711]]
```

These values approach the target outputs [0, 0, 1, 1], showing the network has learned the task.


🔧 How to Run

  1. Save the script as neural_network.py
  2. Run it with:

```
python neural_network.py
```

📚 Learning Resource

This project is inspired by early neural network tutorials on building networks from scratch.


๐Ÿ“ License

MIT License โ€” feel free to modify and use for your own learning or projects.
