Washington University (in St. Louis) Course T81-558: Applications of Deep Neural Networks

ethanblagg/t81_558_deep_learning

 
 


T81-558: Applications of Deep Neural Networks

Washington University in St. Louis

Instructor: Jeff Heaton

Fall 2017, Mondays, Class Room: Psychology #249

Please note: this semester uses TensorFlow 1.0.

Course Description

Deep learning is a group of exciting new technologies for neural networks. By using a combination of advanced training techniques and neural network architectural components, it is now possible to train neural networks of much greater complexity. This course will introduce the student to deep belief neural networks, rectified linear units (ReLU), convolutional neural networks, and recurrent neural networks. High-performance computing (HPC) aspects will demonstrate how deep learning can be leveraged both on graphical processing units (GPUs) and on grids. Deep learning allows a model to learn hierarchies of information in a way that is similar to the function of the human brain. The focus will be primarily on the application of deep learning, with some introduction to its mathematical foundations. Students will use TensorFlow and Keras with the Python programming language to architect deep learning models for several real-world data sets and interpret the results of these networks.
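To make the ideas above concrete, here is a minimal sketch (not course material) of what such a network does under the hood: a two-layer dense network with ReLU activations, trained by gradient descent in plain NumPy. The XOR data, layer sizes, and learning rate are all made-up illustration choices; in the course itself this would be expressed in TensorFlow and Keras.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: XOR, a function no single linear layer can represent,
# so the hidden ReLU layer is actually doing work here.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Two-layer network: 2 inputs -> 8 hidden units (ReLU) -> 1 output
W1 = rng.normal(0.0, 1.0, (2, 8))
b1 = np.zeros(8)
W2 = rng.normal(0.0, 1.0, (8, 1))
b2 = np.zeros(1)

lr = 0.1
losses = []
for step in range(2000):
    # Forward pass
    h_pre = X @ W1 + b1
    h = np.maximum(h_pre, 0.0)        # ReLU: rectified linear unit
    y_hat = h @ W2 + b2
    losses.append(np.mean((y_hat - y) ** 2))  # mean squared error

    # Backward pass (chain rule, i.e. backpropagation)
    d_yhat = 2 * (y_hat - y) / len(X)
    dW2 = h.T @ d_yhat
    db2 = d_yhat.sum(axis=0)
    dh = d_yhat @ W2.T
    dh_pre = dh * (h_pre > 0)         # gradient of ReLU
    dW1 = X.T @ dh_pre
    db1 = dh_pre.sum(axis=0)

    # Gradient descent update
    for p, g in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        p -= lr * g

print(f"loss: {losses[0]:.4f} -> {losses[-1]:.4f}")
```

The same forward pass, loss, and update rule are what TensorFlow and Keras automate; the course works at that higher level rather than writing gradients by hand.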

Objectives

  1. Explain how neural networks (deep and otherwise) compare to other machine learning models.
  2. Determine when a deep neural network would be a good choice for a particular problem.
  3. Demonstrate your understanding of the material through a final project uploaded to GitHub.

Syllabus

This syllabus presents the expected class schedule, due dates, and reading assignments. Download current syllabus.

Class Content
Class 1
08/28/2017
  • Python for Machine Learning
LABOR DAY
09/04/2017
No class (Labor Day)
Class 2
09/11/2017
  • Neural Network & Machine Learning Basics
  • Introduction to TensorFlow
  • Assignment: Read Chapter 1
Class 3
09/18/2017
  • Training a Neural Network
  • Assignment: Read Chapters 4 & 5
Class 4
09/25/2017
  • Classification & Regression
Class 5
10/02/2017
  • Backpropagation
  • Assignment: Read Chapter 6, Program 1 Due (Tuesday, 10-03 at midnight)
Class 6
10/09/2017
  • Preprocessing
  • Assignment: Program 2 Due (Tuesday, 10-10 at midnight)
Fall Break
10/16/2017
No class session
Class 7
10/23/2017
  • Kaggle Datasets
  • Evaluating Neural Network Performance
Class 8
10/30/2017
  • Convolutional Neural Networks
  • Assignment: Take Home Midterm Due (Tuesday, 10-31 at midnight)
Class 9
11/06/2017
  • Regularization and Dropout
  • Assignment: Read Chapter 12
Class 10
11/13/2017
  • Timeseries and Recurrent Neural Networks
  • Assignment: Read Chapter 13
Class 11
11/20/2017
  • Natural Language Processing
  • Assignment: Read Chapter 14, Program 3 Due (Tuesday, 11-21 at midnight) (submitted on Kaggle.com)
Class 12
11/27/2017
  • Applications of Neural Networks
Class 13
12/04/2017
  • Advanced Deep Learning Topics
  • Assignment: Program 4 Due (Tuesday, 12-05 at midnight)
Class 14
12/11/2017
  • GPU, HPC and Cloud
  • Assignment: Final Project Due (Tuesday, 12-18 at midnight)

Datasets

  • Iris - Classify between 3 iris species.
  • Auto MPG - Regression to determine MPG.
  • WC Breast Cancer - Binary classification: malignant or benign.
  • toy1 - The toy1 dataset, regression for weights of geometric solids.

Note: other datasets will be added as the class progresses.
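Datasets like these are typically preprocessed before being fed to a network: numeric features are standardized and categorical targets are one-hot encoded. A minimal sketch of both steps in NumPy; the feature values and species labels below are made-up stand-ins, not rows from the actual Auto MPG or Iris files.

```python
import numpy as np

# Hypothetical numeric features (rows = samples, cols = features),
# in the spirit of weight/displacement columns from a set like Auto MPG.
X = np.array([[3504., 307.],
              [3693., 350.],
              [2372., 113.],
              [2234.,  97.]])

# Z-score normalization: zero mean, unit variance per feature column.
mean = X.mean(axis=0)
std = X.std(axis=0)
X_norm = (X - mean) / std

# One-hot encoding of a categorical target, as for the 3 iris species.
species = np.array(["setosa", "versicolor", "setosa", "virginica"])
cats = sorted(set(species))                    # stable category order
onehot = (species[:, None] == np.array(cats)[None, :]).astype(float)

print(X_norm.mean(axis=0))  # per-column means, now ~0
print(onehot)               # one 1.0 per row, marking the species
```

Without standardization, the large-magnitude column would dominate the gradient updates; with it, each feature contributes on a comparable scale.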

Assignments

Other Information
