Theoretical Deep Learning: generalization ability
This is the GitHub page of the 2nd part of the Theoretical Deep Learning course held by the Neural Networks and Deep Learning Lab., MIPT. For the first part, see this page. Note that the two parts are mostly independent of each other.

The working language of this course is Russian.

Location: Moscow Institute of Physics and Technology, ФИЗТЕХ.ЦИФРА building, room 5-22.

Time: Monday, 10:45. The first lecture (on September 16) will start at 10:45.

Videos will be added to this playlist.

Lecture slides, homework assignments and videos will appear in this repo and will be available for everyone. However, we can guarantee that we will check your homework only if you are a MIPT student.

Further announcements will be in our Telegram chat: https://t.me/joinchat/D_ljjxJHIrD8IuFvfqVLPw

Syllabus:

This syllabus is not final and may change.

  1. Introduction. Short recap of TDL#1. Course structure. Organization notes.

  2. Worst-case generalization bounds. Growth function. Rademacher complexity. Covering numbers. VC-dimension and its variants.

  3. PAC-Bayes bounds. Compressibility approach.

  4. Implicit bias of gradient descent.

Prerequisites:

  • Basic calculus / probability / linear algebra
  • Labs are given as Jupyter notebooks
  • We use Python 3; familiarity with numpy, pytorch, and matplotlib is required
  • Some experience in DL (this should not be your first time training a model on MNIST)
  • Labs can be done on a CPU, but training may take quite a long time (~1-2 days)

Grading:

This course will contain N labs and M theoretical assignments. There will also be an oral exam (in the form of an interview) at the end of the course.

Let p_{hw} = "your points for homeworks" / "total possible points for homeworks (excluding extra points)". Define p_{exam} analogously.

Your final grade will be computed as follows: grade = min(10, p_{hw} * k_{hw} + p_{exam} * k_{exam}), where the coefficients are:

  • k_{hw} = 4
  • k_{exam} = 8

These numbers are not final and may change.
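The grading formula above can be sketched in a few lines of Python (a minimal illustration; the function name and the example point totals are hypothetical, and the coefficients are the currently announced k_{hw} = 4 and k_{exam} = 8):

```python
def final_grade(hw_points, hw_total, exam_points, exam_total,
                k_hw=4, k_exam=8):
    """Course grade: min(10, p_hw * k_hw + p_exam * k_exam)."""
    p_hw = hw_points / hw_total        # homework fraction (totals exclude extra points)
    p_exam = exam_points / exam_total  # exam fraction
    return min(10, p_hw * k_hw + p_exam * k_exam)

# Example: 30/40 homework points and 7/10 exam points
# give min(10, 0.75 * 4 + 0.7 * 8) = 8.6
print(final_grade(30, 40, 7, 10))
```

Note that with these coefficients a perfect score sums to 12, so the min(10, ...) cap means the maximal grade can be reached without full marks on everything.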

Send your homeworks to tdl_course_mipt@protonmail.com

E-mail subjects should be of the form "Lab or theory" + "number" + "-" + "Your Name and Surname"

Homeworks:

Course staff:

This course is dedicated to the memory of Maksim Kretov | 30.12.1986 - 13.02.2019, without whom this course would have never been created.
