
My course labs for APS360: Applied Deep Learning at the University of Toronto


devanshkhare1705/Deep-Learning-Labs


APS360-Labs

Course Background

APS360: Applied Deep Learning was a core course for my Certificate in Artificial Intelligence (AECERAIEN) during my B.A.Sc. at the University of Toronto. The course covered the theory and implementation of current techniques in supervised and unsupervised learning, including their applications to generative AI.

Architectures covered: Transformers, Graph Neural Networks, GANs, Variational Autoencoders, RNNs, CNNs, and ANNs.

The course labs are summarized below and included in this repository.

Labs Overview

Lab 1: Manipulating images and classifying MNIST handwritten digits using Artificial Neural Networks

  1. Performed fundamental NumPy operations.
  2. Loaded and manipulated images using Matplotlib.
  3. Loaded and manipulated tensor data in PyTorch.
  4. Configured and trained Artificial Neural Networks (ANN) using PyTorch.
  5. Evaluated different ANN configurations.
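The kind of ANN configured in this lab can be sketched as a small fully connected PyTorch network for 28x28 MNIST digits; the hidden width here is an arbitrary choice for illustration, not the lab's actual configuration.

```python
import torch
import torch.nn as nn

# A small fully connected network for 28x28 grayscale digits (10 classes).
# The hidden width (50) is illustrative only.
class DigitANN(nn.Module):
    def __init__(self, hidden=50):
        super().__init__()
        self.net = nn.Sequential(
            nn.Flatten(),                 # (N, 1, 28, 28) -> (N, 784)
            nn.Linear(28 * 28, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 10),        # one logit per digit class
        )

    def forward(self, x):
        return self.net(x)

model = DigitANN()
logits = model(torch.randn(4, 1, 28, 28))  # a fake batch of 4 images
print(logits.shape)  # torch.Size([4, 10])
```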

Lab 2: Identifying cats vs. dogs using Convolutional Neural Networks

  1. Implemented the training loop for a machine learning model.
  2. Understood the distinction between training, validation, and test data, along with the concepts of overfitting and underfitting.
  3. Investigated how different hyperparameters, such as learning rate and batch size, affected the success of training.
  4. Compared an ANN (i.e., a multi-layer perceptron) with a CNN.
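The training loop from this lab follows the standard PyTorch train/validate pattern; a minimal sketch on random tensors standing in for cat/dog images is below. The input size, learning rate, and batch size are made-up values, not the lab's hyperparameters.

```python
import torch
import torch.nn as nn

# A toy CNN: one conv block plus a linear classifier (cat vs. dog logits).
class TinyCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 8, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                      # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(8 * 16 * 16, 2)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = TinyCNN()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()

x_train = torch.randn(8, 3, 32, 32)        # fake batch of 8 RGB images
y_train = torch.randint(0, 2, (8,))        # fake cat/dog labels

# One training step: forward, loss, backward, update.
model.train()
optimizer.zero_grad()
loss = criterion(model(x_train), y_train)
loss.backward()
optimizer.step()

# Validation step: no gradient tracking, accuracy instead of loss.
model.eval()
with torch.no_grad():
    val_acc = (model(x_train).argmax(1) == y_train).float().mean()
```

In a real run, the validation accuracy would be computed on a held-out split, not on the training batch; reusing `x_train` here just keeps the sketch short.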

Lab 3: Filling missing census data using Autoencoders

  1. Cleaned and processed continuous and categorical data for machine learning.
  2. Implemented an autoencoder that takes continuous and categorical (one-hot) inputs.
  3. Tuned the hyperparameters of an autoencoder.
  4. Used baseline models to help interpret model performance.
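An autoencoder over mixed inputs like the one in this lab concatenates continuous columns with a one-hot categorical block before encoding. The sketch below uses made-up feature counts (3 continuous features, 5 categories) and bottleneck size.

```python
import torch
import torch.nn as nn

N_CONT, N_CAT, BOTTLENECK = 3, 5, 2   # illustrative sizes only

# Encoder compresses the mixed vector; decoder reconstructs all features.
class MixedAutoencoder(nn.Module):
    def __init__(self):
        super().__init__()
        d = N_CONT + N_CAT
        self.encoder = nn.Sequential(nn.Linear(d, BOTTLENECK), nn.ReLU())
        self.decoder = nn.Linear(BOTTLENECK, d)

    def forward(self, x):
        return self.decoder(self.encoder(x))

cont = torch.randn(4, N_CONT)                                   # continuous features
cat = nn.functional.one_hot(torch.randint(0, N_CAT, (4,)), N_CAT).float()
x = torch.cat([cont, cat], dim=1)                               # (4, 8) mixed input
recon = MixedAutoencoder()(x)
print(recon.shape)  # torch.Size([4, 8])
```

For imputation, a missing categorical value can be fed in as an all-zero one-hot block and read back from the reconstruction.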

Lab 4: Classifying SMS messages as spam or not spam using batched RNNs

  1. Cleaned and processed text data for machine learning.
  2. Implemented a character-level recurrent neural network.
  3. Used torchtext to build recurrent neural network models.
  4. Used torchtext to implement batching for a recurrent neural network.
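A character-level RNN classifier over a padded batch can be sketched in plain PyTorch, shown below instead of torchtext since torchtext's batching APIs have changed substantially across versions; the vocabulary (ASCII code points) and layer sizes are illustrative choices, not the lab's.

```python
import torch
import torch.nn as nn

VOCAB, EMB, HID = 128, 16, 32   # ASCII vocab; sizes are illustrative

# Embed each character, run a GRU over the sequence, and classify the
# final hidden state as spam vs. not-spam.
class CharRNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, EMB, padding_idx=0)
        self.rnn = nn.GRU(EMB, HID, batch_first=True)
        self.out = nn.Linear(HID, 2)

    def forward(self, x):
        _, h = self.rnn(self.emb(x))   # h: (1, N, HID), last hidden state
        return self.out(h.squeeze(0))

# Batching: pad messages to the longest length with index 0.
def encode(msgs):
    longest = max(len(m) for m in msgs)
    return torch.tensor([[ord(c) for c in m] + [0] * (longest - len(m))
                         for m in msgs])

batch = encode(["free prize!!", "see you at 5"])
logits = CharRNN()(batch)
print(logits.shape)  # torch.Size([2, 2])
```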