Welcome to my Deep Learning repository! Here you will find various projects, ranging from fully connected and recurrent neural networks to generative models!
Convolutional Neural Networks.
We first focus on implementing a neural network from scratch in NumPy, to gain an in-depth understanding of backpropagation. Then, using PyTorch, we tackle an image classification task on the CIFAR-10 dataset with two neural network architectures: multi-layer perceptrons (MLPs) and convolutional neural networks (CNNs).
- Assignment, report and code
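The from-scratch part can be illustrated with a minimal sketch (not the repository's actual code): a one-hidden-layer network on toy data, with the backward pass derived by hand via the chain rule, exactly the kind of exercise NumPy backpropagation involves.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data: the target is the sum of the three inputs.
X = rng.normal(size=(64, 3))
y = X.sum(axis=1, keepdims=True)

# Parameters of a 3 -> 8 -> 1 network (hypothetical sizes for illustration).
W1 = rng.normal(scale=0.5, size=(3, 8))
b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1))
b2 = np.zeros(1)

lr = 0.05
for step in range(500):
    # Forward pass.
    h_pre = X @ W1 + b1            # hidden pre-activation
    h = np.maximum(h_pre, 0.0)     # ReLU
    y_hat = h @ W2 + b2
    loss = np.mean((y_hat - y) ** 2)

    # Backward pass: chain rule applied layer by layer.
    d_yhat = 2.0 * (y_hat - y) / len(X)
    dW2 = h.T @ d_yhat
    db2 = d_yhat.sum(axis=0)
    d_h = d_yhat @ W2.T
    d_hpre = d_h * (h_pre > 0)     # gradient through the ReLU
    dW1 = X.T @ d_hpre
    db1 = d_hpre.sum(axis=0)

    # Plain gradient-descent update.
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(f"final MSE: {loss:.4f}")
```

Writing the backward pass out like this makes explicit what autograd frameworks such as PyTorch do for you.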
Generated text via LSTM.
Recurrent Neural Networks (RNNs) vs. Long Short-Term Memory (LSTM) networks: how do they differ in modelling long-term dependencies?
After training an LSTM network on a book, we raised the bar and generated text from it with various sampling techniques. We used Leo Tolstoy's Anna Karenina.
- Assignment, report and code
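One standard sampling technique for text generation is temperature sampling. The sketch below (illustrative only, with a made-up 4-token vocabulary standing in for the LSTM's output logits) shows the idea: a low temperature sharpens the distribution towards greedy decoding, a high temperature flattens it and produces more diverse but noisier text.

```python
import numpy as np

def sample_token(logits, temperature=1.0, rng=None):
    """Draw a token index from softmax(logits / temperature)."""
    if rng is None:
        rng = np.random.default_rng()
    scaled = np.asarray(logits, dtype=float) / temperature
    scaled -= scaled.max()                        # numerical stability
    probs = np.exp(scaled) / np.exp(scaled).sum() # softmax
    return rng.choice(len(probs), p=probs)

# Hypothetical logits over a 4-token vocabulary.
logits = [2.0, 1.0, 0.5, -1.0]
rng = np.random.default_rng(0)
print([sample_token(logits, t, rng) for t in (0.2, 1.0, 2.0)])
```

At temperatures near zero this reduces to always picking the argmax; in practice, values around 0.5 to 1.0 tend to balance coherence and variety.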
GAN training progress.
We studied and implemented three of the most well-known and powerful generative models: Variational Autoencoders (VAEs), Generative Adversarial Networks (GANs), and Generative Normalizing Flows (NFs). We analyzed them from both a theoretical and a practical perspective, presenting their mathematical frameworks alongside results from our implementations.
- Assignment, report and code
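Two ideas at the heart of the VAE part can be sketched in a few lines of NumPy (a simplified illustration, not the assignment code, assuming a diagonal Gaussian encoder and a standard normal prior): the reparameterization trick, which keeps sampling differentiable with respect to the encoder outputs, and the closed-form KL term of the ELBO.

```python
import numpy as np

def reparameterize(mu, log_var, rng):
    # z = mu + sigma * eps: the randomness lives in eps, so gradients
    # flow through mu and log_var.
    eps = rng.normal(size=np.shape(mu))
    return mu + np.exp(0.5 * log_var) * eps

def kl_to_standard_normal(mu, log_var):
    # KL( N(mu, sigma^2) || N(0, I) ), summed over latent dimensions.
    return 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=-1)

rng = np.random.default_rng(0)
mu = np.array([[0.0, 0.0], [1.0, -1.0]])   # two encoder outputs, 2-d latent
log_var = np.zeros((2, 2))                 # unit variance in both cases
z = reparameterize(mu, log_var, rng)
print(kl_to_standard_normal(mu, log_var))  # → [0. 1.]
```

The first row has zero KL because the posterior matches the prior exactly; the second is penalized for the shifted means.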
The majority of the projects come from the lab assignments of the Deep Learning course of the MSc in Artificial Intelligence at the University of Amsterdam.