Course "Theories of Deep Learning"


This is the GitHub page of the Theoretical Deep Learning course held by the Neural Networks and Deep Learning Lab, MIPT. The working language of this course is Russian.

Location: Moscow Institute of Physics and Technology, Phystech.Bio building, room 512

Time: Fridays at 10:45, starting February 15, 2019.

Videos are available here.

Lecture slides, homework assignments and videos will appear in this repo and will be available to everyone. However, we guarantee to check your homework only if you are an MIPT student.

For MIPT students: if you want to include this course in your personal syllabus, its official name is "Теоретический анализ подходов глубокого обучения" ("Theoretical analysis of deep learning approaches").

Further announcements will be in our Telegram chat:


This syllabus is not final and may change. The order of topics is likely to change.

  1. 15.02.2019 Introduction. Loss landscape of linear networks.

  2. 22.02.2019 Loss landscape of linear networks.

  3. 1.03.2019 Loss landscape of linear ResNets. Loss landscape of wide but shallow sigmoid nets.

  4. 8.03.2019 No class.

  5. 15.03.2019 Loss landscape of deep and wide sigmoid nets.

  6. 22.03.2019 Spin-glass model. Elimination of local minima. GD almost surely does not converge to strict saddles.

  7. 29.03.2019 Convergence guarantees for noisy GD. GD dynamics on linear networks.

  8. 5.04.2019 GD dynamics on wide but shallow non-linear networks. Generalization to deep nets.

  9. 12.04.2019 Necessary conditions of learning.

  10. 19.04.2019 (Ivan Skorokhodov) The information bottleneck method.

  11. 26.04.2019 Learning guarantees. Rademacher complexity. VC-dimension.

  12. 3.05.2019 No class.

  13. 10.05.2019 No class.

  14. 17.05.2019 Modern approaches in obtaining generalization guarantees.

  15. 24.05.2019 Modern approaches in obtaining generalization guarantees.

  16. 31.05.2019 Reserve day.


  • Basic calculus / probability / linear algebra (matrix differentiation, SVD, eigenvalues, eigenvectors, Hessian, Markov's inequality)
  • Labs are given as Jupyter notebooks
  • We use Python 3; familiarity with numpy, pytorch, and matplotlib is required
  • Some experience in DL (this should not be your first time training on MNIST; familiarity with terms such as BatchNorm, ResNet, and Dropout)
  • Labs can be done on a CPU, but training may take quite a long time (~1-2 days).


This course will contain (at least) two labs and (at least) three theoretical assignments. There will also be an oral exam (in the form of an interview) at the end of the course.

Let p_{labs} = "your points for labs" / "total possible points for labs". Define p_{theory} and p_{exam} analogously.

Your final grade will be computed as follows: grade = min(10, p_{labs} * k_{labs} + p_{theory} * k_{theory} + p_{exam} * k_{exam}), where the coefficients are:

  • k_{labs} = 3
  • k_{theory} = 5
  • k_{exam} = 4

These numbers are not final and may change slightly.
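The grading formula above can be sketched in Python. Only the coefficients (k_{labs} = 3, k_{theory} = 5, k_{exam} = 4) and the min(10, ...) cap come from the text; the point fractions in the example are hypothetical.

```python
def final_grade(p_labs, p_theory, p_exam,
                k_labs=3, k_theory=5, k_exam=4):
    """Each p_* is earned points divided by total possible points (0..1).

    The weighted sum is capped at 10, the maximum grade.
    """
    return min(10, p_labs * k_labs + p_theory * k_theory + p_exam * k_exam)

# Hypothetical example: 80% on labs, 90% on theory, 75% on the exam
print(final_grade(0.8, 0.9, 0.75))  # 0.8*3 + 0.9*5 + 0.75*4 = 9.9

# A perfect score on everything would sum to 12, but is capped at 10
print(final_grade(1.0, 1.0, 1.0))
```

Note that the cap means the maximum grade of 10 is reachable without perfect scores in every category.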

The deadline for each assignment is computed as follows: "the day the assignment appears on this page, 23:59 Moscow time" + 3 weeks. All deadlines are strict.
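The deadline rule can be sketched with Python's standard datetime module. The publication date below is a hypothetical example; only the "23:59 Moscow time + 3 weeks" rule comes from the text.

```python
from datetime import datetime, timedelta, timezone

MSK = timezone(timedelta(hours=3))  # Moscow time, UTC+3

# Hypothetical publication date of an assignment
published = datetime(2019, 2, 23, 23, 59, tzinfo=MSK)

# Deadline: publication moment plus exactly 3 weeks
deadline = published + timedelta(weeks=3)

print(deadline.strftime("%d.%m.%Y %H:%M"))  # 16.03.2019 23:59
```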

New homework assignments will appear at most once a week.

Send your homework to

E-mail subjects should follow the format: "Lab or theory" + "number" + "-" + "Your Name and Surname"


The first theoretical assignment is out! Deadline: 16.03.2019 23:59 Moscow time.

The first lab assignment is out! Deadline: 28.03.2019 23:59 Moscow time.

Theoretical assignments live here. Labs live here.

Course staff:

  • Eugene Golikov - course admin, lectures, homeworks
  • Ivan Skorokhodov - homework review and beta-test, lecture about information bottleneck, off-screen comments

We also thank Mikhail Arkhipov for handling the gingerbread.

This course is dedicated to the memory of Maksim Kretov | 30.12.1986 - 13.02.2019, without whom this course would have never been created.
