- Ernest K. Ryu
- [Week 1] Optimization and stochastic gradient descent (see the SGD sketch after this outline)
- [Week 2] Shallow neural networks and logistic regression
- [Week 3] Multi-layer perceptron, softmax regression
- [Week 4] Convolutional layers, pooling layers, GPU computing, LeNet (see the LeNet-style sketch after this outline)
- [Week 5] Data augmentation, regularization techniques: dropout, weight decay, early stopping
- [Week 6] Weight initialization, VGGNet, backpropagation
- [Week 7] Optimizers (Adam, RMSProp), NiN (Network in Network), GoogLeNet (see the Adam sketch after this outline)
- [Week 8] Batch normalization, ResNet, DenseNet
- [Week 9] ResNeXt, SENet, DnCNN, super-resolution, inverse problems
- [Weeks 10-11] Flow models
- [Weeks 12-13] Variational autoencoders
- [Weeks 14-15] Generative adversarial networks
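For the Week 1 topic, here is a minimal sketch of a stochastic gradient descent loop in PyTorch. The least-squares objective, synthetic data, batch size, and learning rate are illustrative assumptions, not the course's own example.

```python
import torch

torch.manual_seed(0)
X = torch.randn(256, 10)                           # hypothetical synthetic inputs
y = X @ torch.randn(10) + 0.1 * torch.randn(256)   # noisy linear targets

w = torch.zeros(10, requires_grad=True)            # parameter to learn
opt = torch.optim.SGD([w], lr=0.1)

for step in range(200):
    idx = torch.randint(0, X.shape[0], (32,))      # sample a mini-batch
    loss = ((X[idx] @ w - y[idx]) ** 2).mean()     # mini-batch least-squares loss
    opt.zero_grad()
    loss.backward()                                # stochastic gradient of the loss
    opt.step()                                     # w <- w - lr * grad
```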
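For Week 4, a hedged PyTorch sketch of a LeNet-style network built from convolutional and pooling layers. The layer sizes assume 28x28 grayscale inputs and may differ from the exact architecture covered in lecture.

```python
import torch.nn as nn

lenet = nn.Sequential(
    nn.Conv2d(1, 6, kernel_size=5, padding=2), nn.Sigmoid(),  # 1x28x28 -> 6x28x28
    nn.AvgPool2d(kernel_size=2, stride=2),                    # -> 6x14x14
    nn.Conv2d(6, 16, kernel_size=5), nn.Sigmoid(),            # -> 16x10x10
    nn.AvgPool2d(kernel_size=2, stride=2),                    # -> 16x5x5
    nn.Flatten(),
    nn.Linear(16 * 5 * 5, 120), nn.Sigmoid(),
    nn.Linear(120, 84), nn.Sigmoid(),
    nn.Linear(84, 10),                                        # 10 class logits
)
```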
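For Week 7, a sketch of the Adam update rule written out directly in NumPy; the variable names (`m`, `v`, `beta1`, `beta2`) follow the standard Adam formulation and are not taken from course code. In practice one would use `torch.optim.Adam`; the point here is the two moment estimates and their bias correction.

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for parameter w given gradient grad; t starts at 1."""
    m = beta1 * m + (1 - beta1) * grad            # first moment (mean of gradients)
    v = beta2 * v + (1 - beta2) * grad ** 2       # second moment (uncentered variance)
    m_hat = m / (1 - beta1 ** t)                  # bias-corrected moments
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)   # parameter update
    return w, m, v
```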