This project involves building a Single-Layer Perceptron—the fundamental building block of Artificial Neural Networks—entirely from scratch using Python and NumPy.
Instead of using high-level API calls (like keras.layers.Dense), this repository manually implements the mathematical logic for:
- Weighted Sum Calculation (Dot Product).
- Activation Function (Step Function).
- Weight Update Rule (Rosenblatt's Learning Algorithm).
The Perceptron makes a prediction based on a linear combination of input features:
- Activation: the model uses a Heaviside step function: the output is 1 if $z \ge 0$, else -1.
- Learning rule: weights are updated iteratively based on the prediction error:

$$w := w + \Delta w$$
$$\Delta w = \eta \times (target - prediction) \times x_i$$

(where $\eta$ is the learning rate)
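To make the update rule concrete, here is a minimal sketch of a single weight update in NumPy. The variable names (`w`, `b`, `x_i`, `eta`) are illustrative, not taken from the repository's code; the bias is updated with the same delta, a common convention:

```python
import numpy as np

eta = 0.1                      # learning rate (eta)
w = np.array([0.0, 0.0])       # weights
b = 0.0                        # bias
x_i = np.array([1.0, 2.0])     # one training sample
target, prediction = 1, -1     # a misclassified sample

# Delta w = eta * (target - prediction) * x_i
delta = eta * (target - prediction)   # 0.1 * 2 = 0.2
w = w + delta * x_i                   # weights move toward the target class
b = b + delta                         # bias updated with the same delta
```

Note that when `target == prediction` the delta is zero, so correctly classified samples leave the weights unchanged.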
- Language: Python
- Core Logic: NumPy (for vectorization and efficiency)
- Data Processing: Pandas
- Dataset: Iris Dataset (Binary Classification: Iris-setosa vs. Others)
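As a sketch of the binary setup described above, the Iris data can be reduced to a setosa-vs-others problem like this. This uses scikit-learn's bundled copy of the dataset rather than a pandas CSV load, and the choice of two features is illustrative:

```python
import numpy as np
from sklearn.datasets import load_iris

iris = load_iris()
X = iris.data[:, [0, 2]]                # e.g. sepal length and petal length
y = np.where(iris.target == 0, 1, -1)   # Iris-setosa -> 1, all others -> -1
```

Iris-setosa is linearly separable from the other two species on these features, which is why a single perceptron can reach perfect accuracy here.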
The implementation is encapsulated in a Perceptron class with the following methods:
- `__init__`: Initializes the learning rate and number of epochs.
- `fit(X, y)`: Trains the model by iterating through the dataset and updating weights.
- `net_input(X)`: Calculates the dot product $w \cdot x + b$.
- `predict(X)`: Applies the step function to return class labels.
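The methods above can be sketched as follows. This is a minimal reference implementation consistent with the listed interface, not the repository's exact code; the weight attributes (`w_`, `b_`) and defaults are assumptions:

```python
import numpy as np

class Perceptron:
    """Rosenblatt perceptron sketch matching the interface listed above."""

    def __init__(self, eta=0.1, epochs=10):
        self.eta = eta        # learning rate
        self.epochs = epochs  # passes over the training set

    def fit(self, X, y):
        # One weight per feature, plus a separate bias term.
        self.w_ = np.zeros(X.shape[1])
        self.b_ = 0.0
        for _ in range(self.epochs):
            for xi, target in zip(X, y):
                # Delta w = eta * (target - prediction) * x_i
                update = self.eta * (target - self.predict(xi))
                self.w_ += update * xi
                self.b_ += update
        return self

    def net_input(self, X):
        # Weighted sum: w . x + b
        return np.dot(X, self.w_) + self.b_

    def predict(self, X):
        # Heaviside step: 1 if z >= 0, else -1
        return np.where(self.net_input(X) >= 0, 1, -1)

# Tiny sanity check on a linearly separable toy set
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([-1, -1, 1, 1])
model = Perceptron(eta=0.1, epochs=10).fit(X, y)
```

Vectorizing `net_input` with `np.dot` lets `predict` score a single sample or a whole matrix of samples with the same code path.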
- Clone the repository:
  git clone https://github.com/PyPro2024/Perceptron-Algorithm-From-Scratch.git
- Install dependencies:
pip install numpy pandas scikit-learn
- Run the notebook:
  Open Perceptron.ipynb in Jupyter Notebook or Google Colab to see the training process and accuracy results.
- Training: The model successfully converges and learns to separate the linearly separable classes of the Iris dataset.
- Accuracy: Achieved 100% accuracy on the test set for this specific binary classification task.
If you find this foundational project helpful, feel free to ⭐ the repo!