
Fast.ai Deep Learning from the Foundations (Spring 2019)

Part II of Fast.ai's two-part deep learning course, offered through The Data Institute at USF. The course ran from March through the end of April 2019. Part I is here.

A bottom-up approach (through code, not math equations) to becoming an expert deep learning practitioner and experimenter.

We implemented core fastai and PyTorch classes and modules from scratch, achieving similar or better performance. We also practiced coding up techniques introduced in various papers, then spent significant time on strategies for decreasing model training time (parallelization, JIT compilation).
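For flavor, here is a minimal sketch in the spirit of the Week 8 matmul notebook (not its exact code): a naive Python-loop matrix multiply, then the same computation with broadcasting so the inner loops run inside PyTorch instead of Python.

```python
import torch

def matmul_loops(a, b):
    # Naive triple loop: every multiply-add happens at Python speed.
    ar, ac = a.shape
    br, bc = b.shape
    assert ac == br
    c = torch.zeros(ar, bc)
    for i in range(ar):
        for j in range(bc):
            for k in range(ac):
                c[i, j] += a[i, k] * b[k, j]
    return c

def matmul_broadcast(a, b):
    # a[i] has shape (ac,); unsqueeze to (ac, 1) so it broadcasts
    # against b's (ac, bc), then reduce over k in one fused op.
    c = torch.zeros(a.shape[0], b.shape[1])
    for i in range(a.shape[0]):
        c[i] = (a[i].unsqueeze(-1) * b).sum(dim=0)
    return c
```

On the JIT point, a small illustration using PyTorch's TorchScript compiler (again a sketch, not the course's code): scripting a pointwise function lets PyTorch fuse the ops and skip Python overhead per call.

```python
import torch

@torch.jit.script
def gelu(x: torch.Tensor) -> torch.Tensor:
    # Tanh approximation of GELU; TorchScript compiles this into a graph
    # that can be kernel-fused, which is where the speedup comes from.
    return 0.5 * x * (1.0 + torch.tanh(0.7978845608 * (x + 0.044715 * x ** 3)))
```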

The final two weeks were a deep dive into Swift for TensorFlow with Chris Lattner, where we saw first-hand how differentiable programming works and experienced the joy of writing deep learning models in a compiled language.

All in all, I came away with both the know-how to engineer cutting-edge deep learning ideas from scratch in optimized code and the expertise to research and explore new ideas of my own.

My Reimplementations of Lesson Notebooks

Week 8: Building Optimized Matmul, Forward and Backpropagation from Scratch

Relevant Papers

Week 9: How to Train Your Model

Relevant Papers

Week 10: Wrapping up CNNs

Relevant Papers

Week 11: Data Loading, Optimizers, and Augmentations

Relevant Papers

Week 12: MixUp, XResNets, and ULMFiT (a MixUp sketch follows this list)

Relevant Papers
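Of the Week 12 techniques, MixUp is compact enough to sketch here. It trains on convex combinations of pairs of examples: draw a mixing weight from a Beta distribution, blend the inputs, and blend the loss the same way. The function name and alpha default below are illustrative, not the notebook's exact code.

```python
import torch

def mixup_batch(x, y, alpha: float = 0.4):
    # lam ~ Beta(alpha, alpha); a single weight shared across the batch.
    lam = torch.distributions.Beta(alpha, alpha).sample()
    perm = torch.randperm(x.size(0))
    # Blend each example with a randomly chosen partner from the batch.
    x_mixed = lam * x + (1 - lam) * x[perm]
    return x_mixed, y, y[perm], lam

# The loss is blended with the same weight:
#   loss = lam * criterion(preds, y_a) + (1 - lam) * criterion(preds, y_b)
```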
