
TensorFlow-LiveLessons

This repository is home to the code that accompanies the Deep Learning with TensorFlow LiveLessons that are available within Safari. A high-level summary of these LiveLessons is available on Medium.

Prerequisites

Command Line

Working through these LiveLessons will be easiest if you are familiar with Unix command-line basics. A tutorial covering these fundamentals can be found here.

Python for Data Analysis

In addition, if you're unfamiliar with using Python for data analysis (e.g., the pandas, scikit-learn, matplotlib packages), the data analyst path of DataQuest will quickly get you up to speed -- steps one (Introduction to Python) and two (Intermediate Python and Pandas) provide the bulk of the essentials.

Installation

Step-by-step guides for running the code in this repository can be found in the installation directory.

Notebooks

All of the code that I cover in the LiveLessons can be found in this directory as Jupyter notebooks.

Below is the lesson-by-lesson sequence in which I covered them:

Lesson One: Introduction to Deep Learning

1.1 Neural Networks and Deep Learning

  • via analogy to their biological inspirations, this section introduces Artificial Neural Networks and how they developed into the predominantly deep architectures of today

1.2 Running the Code in These LiveLessons

1.3 An Introductory Artificial Neural Network

  • get your hands dirty with a simple-as-possible neural network (shallow_net_in_keras.ipynb) for classifying handwritten digits
  • introduces Jupyter notebooks and their most useful hot keys
  • introduces a gentle quantity of deep learning terminology by whiteboarding through:
    • the MNIST digit data set
    • the preprocessing of images for analysis with a neural network
    • a shallow network architecture
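
The notebooks themselves use Keras; as a rough NumPy-only sketch of the image preprocessing whiteboarded above (the two random "images" stand in for real MNIST data, which the notebook loads from Keras), the flattening, pixel scaling, and one-hot label encoding look like this:

```python
import numpy as np

# Toy stand-in for MNIST: two fake 28x28 grayscale "images" with labels 3 and 7.
images = np.random.randint(0, 256, size=(2, 28, 28)).astype("float32")
labels = np.array([3, 7])

# Flatten each image into a 784-dimensional vector and scale pixels to [0, 1],
# the input format a dense (fully-connected) network expects.
X = images.reshape(len(images), 784) / 255.0

# One-hot encode the labels across the ten digit classes.
y = np.zeros((len(labels), 10))
y[np.arange(len(labels)), labels] = 1.0
```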

Lesson Two: How Deep Learning Works

2.1 The Families of Deep Neural Nets and their Applications

  • talk through the function and popular applications of the predominant modern families of deep neural nets:
    • Dense / Fully-Connected
    • Convolutional Networks (ConvNets)
    • Recurrent Neural Networks (RNNs) / Long Short-Term Memory units (LSTMs)
    • Reinforcement Learning
    • Generative Adversarial Networks

2.2 Essential Theory I -- Neural Units

  • the following essential deep learning concepts are explained with intuitive, graphical explanations:
    • neural units and activation functions
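
A single neural unit computes a weighted sum of its inputs plus a bias, then passes the result through an activation function. A minimal NumPy sketch (the particular inputs, weights, and bias here are arbitrary illustrations, not the lesson's notebook code):

```python
import numpy as np

def sigmoid(z):
    # Squashes any real-valued input into the range (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def relu(z):
    # Rectified linear unit: passes positives through, clamps negatives to zero.
    return np.maximum(0.0, z)

# One neural unit: weighted sum of inputs plus a bias, then an activation.
x = np.array([0.5, -1.0, 2.0])   # inputs
w = np.array([0.1, 0.4, -0.2])   # weights
b = 0.05                         # bias
z = np.dot(w, x) + b             # the unit's "net input"
a = sigmoid(z)                   # the unit's activation (output)
```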

2.3 Essential Theory II -- Cost Functions, Gradient Descent, and Backpropagation
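
The core idea of this section -- iteratively nudging a parameter downhill on the cost surface -- can be sketched with a one-parameter toy example (my illustration, not the lesson's notebook code; for deep nets, backpropagation computes the gradients automatically):

```python
def cost(w):
    # A toy quadratic cost with its minimum at w = 3.
    return (w - 3.0) ** 2

def gradient(w):
    # dC/dw, found analytically for this toy cost.
    return 2.0 * (w - 3.0)

# Gradient descent: repeatedly step opposite the gradient, scaled by the
# learning rate, so the cost shrinks toward its minimum.
w, learning_rate = 0.0, 0.1
for _ in range(100):
    w -= learning_rate * gradient(w)
```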

2.4 TensorFlow Playground -- Visualizing a Deep Net in Action

2.5 Data Sets for Deep Learning

  • overview of canonical data sets for image classification and meta-resources for data sets ideally suited to deep learning

2.6 Applying Deep Net Theory to Code I

  • apply the theory learned throughout Lesson Two to create an intermediate-depth image classifier (intermediate_net_in_keras.ipynb)
  • builds on, and greatly outperforms, the shallow architecture from Section 1.3

Lesson Three: Convolutional Networks

3.1 Essential Theory III -- Mini-Batches, Unstable Gradients, and Avoiding Overfitting

  • add to our state-of-the-art deep learning toolkit by delving further into essential theory, specifically:
    • weight initialization
      • uniform
      • normal
      • Xavier Glorot
    • stochastic gradient descent
      • learning rate
      • batch size
      • second-order gradient learning
        • momentum
        • Adam
    • unstable gradients
      • vanishing
      • exploding
    • avoiding overfitting / model generalization
      • L1/L2 regularization
      • dropout
      • artificial data set expansion
    • batch normalization
    • more layers
      • max-pooling
      • flatten
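
Two of the toolkit items above -- Glorot weight initialization and dropout -- can be sketched in plain NumPy (the layer sizes and dropout rate are illustrative assumptions; in the notebooks, Keras provides these via layer arguments):

```python
import numpy as np

rng = np.random.default_rng(42)

def glorot_uniform(n_in, n_out):
    # Xavier Glorot initialization: weight variance scaled to the layer's
    # fan-in and fan-out, which helps keep activations from vanishing or
    # exploding as networks get deeper.
    limit = np.sqrt(6.0 / (n_in + n_out))
    return rng.uniform(-limit, limit, size=(n_in, n_out))

def dropout(activations, rate):
    # "Inverted" dropout: randomly zero a fraction of activations at train
    # time and rescale the survivors so their expected sum is unchanged.
    mask = rng.random(activations.shape) >= rate
    return activations * mask / (1.0 - rate)

W = glorot_uniform(784, 64)          # weights for a hypothetical 784->64 layer
a = dropout(np.ones(1000), rate=0.5) # surviving units are scaled up to 2.0
```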

3.2 Applying Deep Net Theory to Code II

  • apply the theory learned in the previous section to create a deep, dense net for image classification (deep_net_in_keras.ipynb)
  • builds on, and outperforms, the intermediate architecture from Section 2.6

3.3 Introduction to Convolutional Neural Networks for Visual Recognition

  • whiteboard through an intuitive explanation of what convolutional layers are and why they're so effective

3.4 Classic ConvNet Architectures -- LeNet-5

  • apply the theory learned in the previous section to create a deep convolutional net for image classification (lenet_in_keras.ipynb) that is inspired by the classic LeNet-5 neural network introduced in Section 1.1
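
The two operations at the heart of a ConvNet -- convolution and max-pooling -- can be sketched in plain NumPy (the 4x4 "image" and horizontal edge-detecting kernel are toy illustrations; the notebook uses Keras Conv2D and MaxPooling2D layers):

```python
import numpy as np

def conv2d(image, kernel):
    # 'Valid' cross-correlation: slide the kernel over the image, taking the
    # sum of elementwise products at each position.
    kh, kw = kernel.shape
    h = image.shape[0] - kh + 1
    w = image.shape[1] - kw + 1
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def max_pool(feature_map, size=2):
    # Non-overlapping max-pooling: keep only the largest value in each
    # size x size block, halving each spatial dimension when size=2.
    h, w = feature_map.shape
    return feature_map[:h - h % size, :w - w % size] \
        .reshape(h // size, size, w // size, size).max(axis=(1, 3))

image = np.arange(16.0).reshape(4, 4)
edges = conv2d(image, np.array([[1.0, -1.0]]))  # horizontal-difference kernel
pooled = max_pool(image)
```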

3.5 Classic ConvNet Architectures -- AlexNet and VGGNet

3.6 TensorBoard and the Interpretation of Model Outputs

  • return to the networks from the previous section, adding code to output results to the TensorBoard deep learning results-visualization tool
  • explore TensorBoard and explain how to interpret model results within it

Lesson Four: Introduction to TensorFlow

4.1 Comparison of the Leading Deep Learning Libraries

  • discuss the relative strengths, weaknesses, and common applications of the leading deep learning libraries:
    • Caffe
    • Torch
    • Theano
    • TensorFlow
    • and the high-level APIs TFLearn and Keras
  • conclude that, for the broadest set of applications, TensorFlow is the best option

4.2 Introduction to TensorFlow

4.3 Fitting Models in TensorFlow

4.4 Dense Nets in TensorFlow

4.5 Deep Convolutional Nets in TensorFlow

  • create a deep convolutional neural net (lenet_in_tensorflow.ipynb) in TensorFlow with an architecture identical to the LeNet-inspired one built in Keras in Section 3.4

Lesson Five: Improving Deep Networks

5.1 Improving Performance and Tuning Hyperparameters

  • detail systematic steps for improving the performance of deep neural nets, including by tuning hyperparameters
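
One systematic approach to hyperparameter tuning is a grid search over candidate values, keeping the combination that scores best on a validation set. A hedged sketch (the grid values are arbitrary examples, and the scoring function is a stand-in for actually training and validating a model):

```python
import itertools

# Hypothetical hyperparameter grid; these values are illustrative.
grid = {
    "learning_rate": [0.1, 0.01, 0.001],
    "batch_size": [32, 128],
    "dropout": [0.0, 0.5],
}

def validation_accuracy(params):
    # Stand-in for training a model with these hyperparameters and
    # scoring it on a held-out validation set.
    return 1.0 - abs(params["learning_rate"] - 0.01) - params["dropout"] * 0.01

# Evaluate every combination and keep the best-scoring one.
best = max(
    (dict(zip(grid, values)) for values in itertools.product(*grid.values())),
    key=validation_accuracy,
)
```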

5.2 How to Build Your Own Deep Learning Project

  • specific steps for designing and evaluating your own deep learning project

5.3 Resources for Self-Study

  • topics worth investing time in to become an expert deployer of deep learning models

