This project implements and compares optimization algorithms used in neural network training, including:
- Gradient Descent
- Momentum
- Adam
- Learning Rate Decay
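The update rules behind the optimizers above can be sketched in NumPy. This is an illustrative toy example (1-D quadratic objective, made-up hyperparameters and function names), not the notebook's actual code:

```python
import numpy as np

def gd_update(w, grad, lr):
    # Plain gradient descent: step directly against the gradient.
    return w - lr * grad

def momentum_update(w, grad, v, lr, beta=0.9):
    # Momentum: accumulate a velocity so consistent gradients build up speed.
    v = beta * v + grad
    return w - lr * v, v

def adam_update(w, grad, m, v, t, lr, b1=0.9, b2=0.999, eps=1e-8):
    # Adam: bias-corrected first and second moment estimates scale the step.
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad ** 2
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    return w - lr * m_hat / (np.sqrt(v_hat) + eps), m, v

def decayed_lr(lr0, t, k=0.01):
    # Inverse-time learning-rate decay: lr_t = lr0 / (1 + k * t).
    return lr0 / (1 + k * t)

# Compare the three updates on the toy objective f(w) = w**2 (gradient 2w).
w_gd = w_mom = w_adam = np.array([2.0])
v_mom = m = v = np.zeros(1)
for t in range(1, 501):
    w_gd = gd_update(w_gd, 2 * w_gd, lr=0.1)
    w_mom, v_mom = momentum_update(w_mom, 2 * w_mom, v_mom, lr=0.1)
    w_adam, m, v = adam_update(w_adam, 2 * w_adam, m, v, t, lr=0.1)
```

All three drive the toy loss toward its minimum; the notebook compares the same updates on a real training problem.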
The notebook visualizes loss curves and decision boundaries to compare optimizer behavior.
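A minimal sketch of that kind of loss-curve comparison, using a toy 1-D quadratic and illustrative learning rates rather than the notebook's actual data:

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs without a display
import matplotlib.pyplot as plt

def run_gd(lr, steps=50):
    # Record the loss f(w) = w**2 at each gradient-descent step.
    w, losses = 2.0, []
    for _ in range(steps):
        losses.append(w ** 2)
        w -= lr * 2 * w  # gradient of w**2 is 2w
    return losses

losses_fast = run_gd(lr=0.3)
losses_slow = run_gd(lr=0.05)

plt.plot(losses_fast, label="lr=0.3")
plt.plot(losses_slow, label="lr=0.05")
plt.xlabel("iteration")
plt.ylabel("loss")
plt.legend()
plt.savefig("loss_curves.png")
```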
- `optimization_algs.ipynb` – main notebook implementation
- Python
- NumPy
- Matplotlib
- Jupyter Notebook