This repository contains a 4-week structured plan to refresh and strengthen my deep learning fundamentals.
It combines theoretical notes, mathematical derivations, and practical implementations in both NumPy (from scratch) and PyTorch (a modern framework).
- Re-derive and document the key mathematics behind deep learning.
- Implement core models from scratch in NumPy.
- Reproduce the same models using PyTorch.
- Cover essential architectures: MLPs, CNNs, RNNs/LSTMs.
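As a taste of the mathematics the notes will re-derive, the core backpropagation recursion for a feedforward network can be stated compactly (notation is assumed here, not taken from the repo's docs: $C$ is the cost, $z^l = W^l a^{l-1} + b^l$ the pre-activation of layer $l$, $a^l = \sigma(z^l)$ its activation, and $L$ the output layer):

```latex
\delta^L = \nabla_{a^L} C \odot \sigma'(z^L)
\qquad\text{(output-layer error)}
```
```latex
\delta^l = \left( (W^{l+1})^{\top} \delta^{l+1} \right) \odot \sigma'(z^l)
\qquad\text{(backward recursion)}
```
```latex
\frac{\partial C}{\partial W^l} = \delta^l \, (a^{l-1})^{\top},
\qquad
\frac{\partial C}{\partial b^l} = \delta^l
```

These are the standard backprop equations as presented, for example, in Nielsen's book listed under Resources.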
- src/ → Source code (NumPy and PyTorch implementations).
- notebooks/ → Jupyter notebooks with experiments and visualizations.
- docs/ → Theoretical notes (Markdown + LaTeX formulas).
- data/ → Links or small datasets for experiments.
- Week 1: Backpropagation from scratch (NumPy) + theory notes.
- Week 2: Optimization algorithms (SGD, Adam) + PyTorch basics.
- Week 3: Deep networks and CNNs.
- Week 4: RNNs and LSTMs for sequences.
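As a preview of the Week 1 material, here is a minimal sketch of backpropagation for a one-hidden-layer MLP in pure NumPy. All names, sizes, and the toy regression task are illustrative assumptions; the actual implementation lives in src/.

```python
import numpy as np

# Toy regression task (illustrative, not the repo's dataset):
# 64 samples with 3 features, target is the squared feature sum.
rng = np.random.default_rng(0)
X = rng.normal(size=(64, 3))
y = X.sum(axis=1, keepdims=True) ** 2

# One hidden layer of 8 tanh units (hypothetical sizes).
W1 = rng.normal(scale=0.5, size=(3, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros((1, 1))
lr = 0.05

for step in range(500):
    # Forward pass
    z1 = X @ W1 + b1
    a1 = np.tanh(z1)
    y_hat = a1 @ W2 + b2
    loss = np.mean((y_hat - y) ** 2)
    if step == 0:
        first_loss = loss

    # Backward pass: chain rule applied layer by layer
    dy = 2 * (y_hat - y) / len(X)              # dLoss/dy_hat for MSE
    dW2 = a1.T @ dy
    db2 = dy.sum(axis=0, keepdims=True)
    da1 = dy @ W2.T
    dz1 = da1 * (1 - a1 ** 2)                  # tanh'(z) = 1 - tanh(z)^2
    dW1 = X.T @ dz1
    db1 = dz1.sum(axis=0, keepdims=True)

    # Vanilla SGD update (Week 2 swaps this for Adam, etc.)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"MSE: {first_loss:.4f} -> {loss:.4f}")
```

The same loop structure (forward, backward, update) carries over to the PyTorch reimplementation, where autograd replaces the hand-written backward pass.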
Requires Python 3.10+ and the following packages:
pip install numpy matplotlib pandas torch torchvision scikit-learn
Clone the repo and run any notebook:
git clone git@github.com:Limman-qaidev/deep-learning-foundations.git
cd deep-learning-foundations
jupyter notebook notebooks/week1_backprop.ipynb

Each week will produce:
- A theoretical markdown document (docs/).
- A Python implementation (src/).
- A Jupyter notebook with experiments (notebooks/).
- Andrew Ng – Neural Networks and Deep Learning (Coursera)
- Michael Nielsen – Neural Networks and Deep Learning (free online book)
- Aston Zhang, Zachary C. Lipton, Mu Li, Alexander J. Smola – Dive into Deep Learning (D2L)
- Stanford CS230 – Deep Learning Course Notes
- Ian Goodfellow, Yoshua Bengio, Aaron Courville – Deep Learning (MIT Press)
- PyTorch Tutorials – Deep Learning with PyTorch: A 60 Minute Blitz