Hands-on Deep Learning course with Python + NumPy, from basic concepts to practical implementation.
- Understand deeply how Neural Networks work
- Code from scratch without frameworks (TensorFlow, PyTorch)
- Master the mathematics behind Deep Learning
- Learn through systematic hands-on labs
- Forward Propagation
- Loss Functions (MSE)
- Backward Propagation
- Gradient Descent
- See: lab01/
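The four topics above fit in one tiny pipeline. As a minimal sketch (variable names like `w`, `b`, and `lr` are illustrative, not the lab's actual code): a single linear neuron trained with MSE loss and plain gradient descent on noiseless data.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))          # 100 samples, 1 feature
y = 3.0 * X[:, 0] + 2.0                # target: y = 3x + 2

w, b = 0.0, 0.0                        # parameters
lr = 0.1                               # learning rate

for _ in range(200):
    # Forward propagation: prediction of the linear model
    y_hat = w * X[:, 0] + b
    # Loss: mean squared error over the batch
    loss = np.mean((y_hat - y) ** 2)
    # Backward propagation: gradients of MSE w.r.t. w and b
    grad_w = 2 * np.mean((y_hat - y) * X[:, 0])
    grad_b = 2 * np.mean(y_hat - y)
    # Gradient descent update
    w -= lr * grad_w
    b -= lr * grad_b
```

After 200 steps the parameters recover the true slope and intercept, which is the whole lab in miniature.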
- Sigmoid, ReLU, Tanh
- Non-linearity in Neural Networks
- Vanishing gradient problem
- See: lab02/
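A quick sketch of the three activations (function names are illustrative). The sigmoid derivative peaks at 0.25, which is the root of the vanishing-gradient problem: stacking many sigmoid layers multiplies gradients by factors ≤ 0.25 at each layer.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def relu(x):
    return np.maximum(0.0, x)

def tanh(x):
    return np.tanh(x)

def sigmoid_grad(x):
    # s * (1 - s), at most 0.25 -> gradients shrink through deep sigmoid stacks
    s = sigmoid(x)
    return s * (1.0 - s)
```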
- Stacking layers
- Deep neural networks
- Backpropagation through multiple layers
- See: lab03/
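Stacking layers is just repeated matrix multiplies with a non-linearity in between. A hypothetical sketch (the `(W, b)` list layout is an assumption, not the lab's API): a forward pass through a list of layers, ReLU between hidden layers, linear output.

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

def forward(x, layers):
    # hidden layers: affine transform + ReLU
    for W, b in layers[:-1]:
        x = relu(x @ W + b)
    # output layer: affine only (no activation)
    W, b = layers[-1]
    return x @ W + b

rng = np.random.default_rng(0)
layers = [(rng.normal(size=(4, 8)), np.zeros(8)),
          (rng.normal(size=(8, 8)), np.zeros(8)),
          (rng.normal(size=(8, 2)), np.zeros(2))]
out = forward(rng.normal(size=(5, 4)), layers)   # batch of 5 inputs
```

Backpropagation then walks this same list in reverse, applying the chain rule layer by layer.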
- Binary classification
- Softmax and Cross-entropy
- Multi-class classification
- See: lab04/
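Softmax and cross-entropy can be sketched in a few lines of NumPy (function names are illustrative). Subtracting the row maximum before `exp` keeps the computation numerically stable without changing the result.

```python
import numpy as np

def softmax(z):
    # shift by the row max for numerical stability
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(probs, labels):
    # labels are integer class indices; pick each row's true-class probability
    n = probs.shape[0]
    return -np.mean(np.log(probs[np.arange(n), labels]))
```

Binary classification is the two-class special case, where softmax over two logits reduces to a sigmoid over their difference.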
- SGD variations
- Momentum, Adam, RMSprop
- Learning rate scheduling
- See: lab05/
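As one representative SGD variation, here is a minimal momentum sketch (parameter names and defaults are assumptions): a velocity term accumulates past gradients, smoothing the descent on the toy objective f(w) = w².

```python
import numpy as np

def sgd_momentum_step(w, v, grad, lr=0.1, beta=0.9):
    v = beta * v + grad          # accumulate a velocity from past gradients
    w = w - lr * v               # move along the velocity, not the raw gradient
    return w, v

w, v = 5.0, 0.0
for _ in range(200):
    grad = 2 * w                 # gradient of f(w) = w^2
    w, v = sgd_momentum_step(w, v, grad)
```

Adam and RMSprop extend the same idea with per-parameter adaptive step sizes; learning rate scheduling varies `lr` over training instead of keeping it fixed.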
- Overfitting problem
- L1/L2 regularization
- Dropout
- See: lab06/
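Both regularizers fit in a short sketch (names are illustrative). The L2 penalty adds λ·‖W‖² to the loss; inverted dropout zeroes units with probability `p` during training and rescales the survivors so the expected activation is unchanged at test time.

```python
import numpy as np

def l2_penalty(weights, lam):
    # lam * sum of squared entries over all weight matrices
    return lam * sum((W ** 2).sum() for W in weights)

def dropout(x, p, rng, training=True):
    if not training:
        return x                 # identity at inference time
    mask = rng.random(x.shape) >= p
    return x * mask / (1.0 - p)  # rescale so E[output] == x
```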
- MNIST handwritten digits
- Data preprocessing and normalization
- Model evaluation metrics
- See: lab07/
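A sketch of MNIST-style preprocessing and a basic evaluation metric (function names are assumptions): pixel values 0–255 are scaled to [0, 1] and the 28×28 images flattened to vectors; accuracy is the fraction of samples whose argmax prediction matches the label.

```python
import numpy as np

def preprocess(images):
    # scale uint8 pixels to [0, 1] and flatten each image
    x = images.astype(np.float64) / 255.0
    return x.reshape(len(x), -1)           # shape (n, 28*28)

def accuracy(logits, labels):
    # fraction of rows whose argmax matches the true class index
    return float(np.mean(logits.argmax(axis=1) == labels))
```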
# Start with the first lab
cd lab01/