Deep Learning

Deep Learning Using PyTorch

Lecturer: Hossein Hajiabolhassan

Data Science Center

Shahid Beheshti University
Teaching Assistants:
Behnaz H.M. Hoseyni, Yavar T. Yeganeh, Mostafa Khodayari, Esmail Mafakheri

Index:


Course Overview:

In this course, you will learn the foundations of Deep Learning, understand how to build neural networks, and learn how to lead successful machine learning projects. You will learn about convolutional networks, RNNs, LSTMs, Adam, Dropout, BatchNorm, and more.

Main TextBooks:

Book 1 Book 2 Book 3 Book 4 Book 5

Additional TextBooks:

Slides and Papers:

Recommended Slides & Papers:

  1. Introduction (1 Session)

Required Reading:
Suggested Reading:
Additional Resources:
  • Video of lecture by Ian Goodfellow and discussion of Chapter 1 at a reading group in San Francisco organized by Alena Kruchkova
  • Paper: On the Origin of Deep Learning by Haohan Wang and Bhiksha Raj
Applied Mathematics and Machine Learning Basics:
  2. Toolkit Lab 1: Google Colab and Anaconda (1 Session)

Required Reading:
Suggested Reading:
Additional Resources:
  3. Toolkit Lab 2: Getting Started with PyTorch (2 Sessions)

Required Reading:
Suggested Reading:
Additional Resources:
  • Blog: Learning PyTorch with Examples by Justin Johnson. This tutorial introduces the fundamental concepts of PyTorch through self-contained examples.
Building Dynamic Models Using the Subclassing API (sketched below):
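The following is a minimal sketch, not taken from the course materials, of the subclassing style referenced above: a small feedforward model is built by subclassing torch.nn.Module and run on a random mini-batch.

```python
# Minimal sketch (not part of the course materials) of defining a dynamic model
# by subclassing torch.nn.Module, in the spirit of Justin Johnson's tutorial.
import torch
import torch.nn as nn

class TwoLayerNet(nn.Module):
    def __init__(self, d_in, d_hidden, d_out):
        super().__init__()
        self.linear1 = nn.Linear(d_in, d_hidden)
        self.linear2 = nn.Linear(d_hidden, d_out)

    def forward(self, x):
        # Arbitrary Python control flow is allowed here; the graph is built dynamically.
        h = torch.relu(self.linear1(x))
        return self.linear2(h)

model = TwoLayerNet(d_in=10, d_hidden=32, d_out=1)
y = model(torch.randn(4, 10))  # forward pass on a random mini-batch
print(y.shape)                 # torch.Size([4, 1])
```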
  4. Deep Feedforward Networks (6 Sessions)

Required Reading:
Interesting Questions:
Suggested Reading:
Additional Resources:
  5. Toolkit Lab 3: Preprocessing Datasets by PyTorch (1 Session; see the sketch below)

Required Reading:
Suggested Reading:
Additional Resources:
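As a hedged illustration of the dataset-loading workflow this lab's title refers to (the actual lab content may differ), the sketch below wraps random stand-in tensors in a custom torch.utils.data.Dataset and batches them with a DataLoader.

```python
# Illustrative only: a tiny custom Dataset plus DataLoader, the usual PyTorch
# preprocessing/loading pattern. The tensors here are random stand-ins for real data.
import torch
from torch.utils.data import Dataset, DataLoader

class ToyDataset(Dataset):
    def __init__(self, n=100):
        self.x = torch.randn(n, 3)                 # features
        self.y = (self.x.sum(dim=1) > 0).float()   # toy labels

    def __len__(self):
        return len(self.x)

    def __getitem__(self, idx):
        return self.x[idx], self.y[idx]

loader = DataLoader(ToyDataset(), batch_size=16, shuffle=True)
for xb, yb in loader:
    print(xb.shape, yb.shape)  # torch.Size([16, 3]) torch.Size([16])
    break
```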
  6. Regularization for Deep Learning (5 Sessions)

Required Reading:
Suggested Reading:
Additional Reading:
  7. Toolkit Lab 4: Using a Neural Network to Fit the Data with PyTorch (2 Sessions; see the sketch below)

Required Reading:
Suggested Reading:
Additional Resources:
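A minimal, hypothetical sketch of what "using a neural network to fit the data" looks like in PyTorch: a small model is fit to synthetic noisy linear data with an MSE loss and the Adam optimizer (the lab's actual data and exercises may differ).

```python
# Illustrative sketch of fitting a small network to synthetic 1-D data.
import torch
import torch.nn as nn

x = torch.linspace(-1, 1, 200).unsqueeze(1)   # inputs, shape (200, 1)
y = 3 * x + 0.5 + 0.1 * torch.randn_like(x)   # noisy linear target

model = nn.Sequential(nn.Linear(1, 16), nn.Tanh(), nn.Linear(16, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for epoch in range(200):
    optimizer.zero_grad()
    loss = loss_fn(model(x), y)
    loss.backward()
    optimizer.step()

print(f"final training loss: {loss.item():.4f}")
```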
  8. Optimization for Training Deep Models (5 Sessions)

Required Reading:
Suggested Reading:
Additional Reading:
  9. Convolutional Networks (3 Sessions)

Required Reading:
Suggested Reading:
Additional Reading:  
Fourier Transformation:
  10. Toolkit Lab 5: Using Convolutions to Generalize (2 Sessions; see the sketch below)

Required Reading:    
Suggested Reading:
Additional Resources:
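The sketch below is an illustrative (not course-provided) example of the convolutional pattern this lab's title refers to: two conv/pool stages followed by a linear classifier for CIFAR-sized 3x32x32 inputs.

```python
# Illustrative only: a small convolutional network for 3x32x32 images,
# showing the conv -> pool -> fully-connected pattern.
import torch
import torch.nn as nn

class SmallConvNet(nn.Module):
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)  # 32 channels, 8x8 feature map

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

logits = SmallConvNet()(torch.randn(4, 3, 32, 32))
print(logits.shape)  # torch.Size([4, 10])
```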
  11. Sequence Modeling: Recurrent and Recursive Networks (4 Sessions)

Required Reading:
Suggested Reading:
Additional Reading:
  12. Toolkit Lab 6: Transfer Learning and Other Tricks (1 Session; see the sketch below)

Required Reading:    
Suggested Reading:
Additional Resources:
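A hedged sketch of the standard transfer-learning recipe this lab's title points to: load a pretrained torchvision model, freeze the backbone, and replace the final fully connected layer for a new (here hypothetical, two-class) task.

```python
# Illustrative transfer-learning sketch: pretrained backbone, frozen weights, new head.
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(pretrained=True)   # ImageNet-pretrained backbone
for param in model.parameters():
    param.requires_grad = False            # freeze all backbone weights

num_classes = 2                            # hypothetical downstream task
model.fc = nn.Linear(model.fc.in_features, num_classes)  # new, trainable head

# Only the new head's parameters are handed to the optimizer.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```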
  13. Practical Methodology (2 Sessions)

Required Reading:
Suggested Reading:
Additional Reading:

  14. Optuna: Automatic Hyperparameter Optimization

Optuna is an automatic hyperparameter optimization software framework, particularly designed for machine learning; a minimal usage sketch follows this item's reading list.

Required Reading:
Suggested Reading:
Additional Resources:
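The sketch below shows basic Optuna usage with a hypothetical objective function (not the course's exercise): the objective samples hyperparameters from the trial and returns a score, and Optuna searches for the minimum over a number of trials.

```python
# Illustrative Optuna usage with a toy objective. In practice the objective would
# train a model and return a validation metric.
import optuna

def objective(trial):
    x = trial.suggest_float("x", -10.0, 10.0)
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    return (x - 2.0) ** 2 + lr

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
print(study.best_params)
```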
  15. Applications (1 Session)

Required Reading:
Suggested Reading:
Additional Reading:
  16. Autoencoders (2 Sessions)

Required Reading:
Suggested Reading:
Additional Reading:
  17. Generative Adversarial Networks (1 Session)

Required Reading:

Slide: Generative Adversarial Networks (GANs) by Binglin, Shashank, and Bhargav
Paper: NIPS 2016 Tutorial: Generative Adversarial Networks by Ian Goodfellow

Suggested Reading:
Additional Reading:
  18. Graph Neural Networks (1 Session)

Required Reading:
Suggested Reading:
Additional Reading:

Additional Resources:

Class Time and Location:

Saturday and Monday, 10:30 AM-12:00 PM (Fall 2020)

Recitation and Assignments:

Tuesday, 16:00-18:00 (Fall 2020). Refer to the following link to check the assignments.

Projects:

Projects are programming assignments that cover the topics of this course. Each project is written as a Jupyter Notebook. Projects will require the use of Python 3.7, as well as additional Python libraries.

Google Colab:

Google Colab is a free cloud service, and it supports free GPU access! A quick way to check that a GPU is visible from a notebook is sketched below.
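For example, after enabling a GPU runtime (Runtime > Change runtime type > Hardware accelerator: GPU), the following generic check confirms that PyTorch can see the GPU; this is a minimal sketch, not an official Colab snippet.

```python
# Run in a Colab cell to verify that a GPU runtime is attached and visible to PyTorch.
import torch

print(torch.cuda.is_available())   # True when a GPU runtime is attached
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
print(device)                      # "cuda" on a GPU runtime, otherwise "cpu"
```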

Fascinating Guides For Machine Learning:

Latex:

Students can include mathematical notation within Markdown cells of their Jupyter Notebooks using LaTeX; a short example appears after the resources below.

  • A Brief Introduction to LaTeX PDF
  • Math in LaTeX PDF
  • Sample Document PDF
  • TikZ: A collection of LaTeX files of PGF/TikZ figures (including various neural networks) by Petar Veličković.
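As a small, illustrative example of LaTeX inside a Jupyter Markdown cell (the formulas are arbitrary): inline math goes between single dollar signs and display math between double dollar signs.

```latex
Inline math: the loss is $L(\theta) = \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - f_\theta(x_i)\bigr)^2$.

Display math:

$$
\hat{\theta} = \arg\min_{\theta} \; L(\theta) + \lambda \lVert \theta \rVert_2^2
$$
```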

Grading:

  • Projects and Midterm – 50%
  • Endterm – 50%

Three Exams:

  • First Midterm Examination: Saturday 1399/09/01, 10:30-12:00
  • Second Midterm Examination: Saturday 1399/10/06, 10:30-12:00
  • Final Examination: Wednesday 1399/10/24, 14:00-16:00

Prerequisites:

General mathematical sophistication, and a solid understanding of Algorithms, Linear Algebra, and Probability Theory at the advanced undergraduate or beginning graduate level, or equivalent.

Linear Algebra:

Probability and Statistics:

Topics:

Have a look at some reports by Kaggle or Stanford students (CS224N, CS224D) to get some general inspiration.

Account:

It is necessary to have a GitHub account to share your projects. GitHub offers both free accounts and plans with private repositories. GitHub is like the hammer in your toolbox; therefore, you need to have it!

Academic Honor Code:

Honesty and integrity are vital elements of academic work. All your submitted assignments must be entirely your own (or your own group's).

We will follow the standard approach of the Department of Mathematical Sciences:

  • You can get help, but you MUST acknowledge the help on the work you hand in
  • Failure to acknowledge your sources is a violation of the Honor Code
  • You can talk to others about the algorithm(s) to be used to solve a homework problem; as long as you then mention their name(s) on the work you submit
  • You should not use or look at others' code when you write your own: you can talk to people, but you have to write your own solution/code

Questions?

I will hold office hours for this course on Saturdays (09:00-10:00 AM). If this is not convenient, email me at hhaji@sbu.ac.ir or talk to me after class.