An interactive book on deep learning. Much easy, so MXNet. Wow.

Deep Learning - The Straight Dope

Note: Straight Dope is growing up! Much of this content has been incorporated into the new Dive into Deep Learning book.


This repo contains an incremental sequence of notebooks designed to teach deep learning, MXNet, and the gluon interface. Our goal is to leverage the strengths of Jupyter notebooks to present prose, graphics, equations, and code together in one place. If we're successful, the result will be a resource that is simultaneously a book, course material, a prop for live tutorials, and a source of useful code to plagiarise (with our blessing). To our knowledge, no existing resource either (1) teaches the full breadth of concepts in modern deep learning or (2) interleaves an engaging textbook with runnable code. We'll find out by the end of this venture whether or not that void exists for a good reason.

Another unique aspect of this book is its authorship process. We are developing this resource fully in the public view and are making it available for free in its entirety. While the book has a few primary authors who set the tone and shape the content, we welcome contributions from the community and hope to coauthor chapters and entire sections with experts and community members. Already we've received contributions ranging from typo corrections to full working examples.

Implementation with Apache MXNet

Throughout this book, we rely on MXNet to teach core concepts, advanced topics, and a full complement of applications. MXNet is widely used in production environments owing to its strong reputation for speed. With gluon, MXNet's new imperative interface (currently in alpha), doing research in MXNet is now easy as well.


To run these notebooks, you'll want to build MXNet from source. Fortunately, this is easy (especially on Linux) if you follow these instructions. You'll also want to install Jupyter and use Python 3 (because it's 2017).
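If building from source is more than you need, the notebooks also run against prebuilt packages. A sketch of that route (assuming `pip` for Python 3 is available; `mxnet` and `jupyter` are the standard PyPI package names):

```shell
# Install a prebuilt MXNet wheel plus Jupyter, then launch the notebook server
python3 -m pip install mxnet jupyter
jupyter notebook
```

Building from source remains the recommended path if you want the latest gluon features, since the interface is still in alpha.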


The authors (and others) are increasingly giving talks based on the content in this book. Some of these slide decks (like the 6-hour KDD 2017 deck) are gigantic, so we're collecting them separately in this repo. Contribute there if you'd like to share tutorials or course material based on this book.


As we write the book, large stable sections are simultaneously being translated into Chinese (中文), available both as a web version and as GitHub source.

Table of contents

Part 1: Deep Learning Fundamentals

Part 2: Applications

Part 3: Advanced Methods


  • Appendix 1: Cheatsheets
    • Roadmap gluon
    • Roadmap PyTorch to MXNet (work in progress)
    • Roadmap Tensorflow to MXNet
    • Roadmap Keras to MXNet
    • Roadmap Math to MXNet

Choose your own adventure

We've designed these tutorials so that you can traverse the curriculum in more than one way.

  • Anarchist - Choose whatever you want to read, whenever you want to read it.
  • Imperialist - Proceed through all tutorials in order. In this fashion you will encounter each model from scratch first, writing all of the code yourself except for the basic linear algebra primitives and automatic differentiation.
  • Capitalist - If you don't care how things work (or already know) and just want to see working code in gluon, you can skip the from-scratch tutorials and go straight to the production-like code using the high-level gluon front end.


This evolving creature is a collaborative effort (see contributors tab). The lead writers, assimilators, and coders include:


In creating these tutorials, we've drawn inspiration from some of the resources that allowed us to learn deep learning and machine learning with other libraries in the past. These include:


  • Already, in the short time this project has been off the ground, we've gotten some helpful PRs from the community with pedagogical suggestions, typo corrections, and other useful fixes. If you're inclined, please contribute!