Programming assignments of the Deep Learning Specialization on Coursera.
In the first course of the Deep Learning Specialization, you will study the foundational concepts of neural networks and deep learning.
By the end, you will be familiar with the significant technological trends driving the rise of deep learning; build, train, and apply fully connected deep neural networks; implement efficient (vectorized) neural networks; identify key parameters in a neural network’s architecture; and apply deep learning to your own applications.
- W2A1: Python Basics with Numpy
- W2A2: Logistic Regression with a Neural Network mindset
- W3A1: Planar data classification with one hidden layer
- W4A1: Building your Deep Neural Network: Step by Step
- W4A2: Deep Neural Network for Image Classification: Application
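The recurring exercise across these assignments is to implement forward and backward passes with vectorized NumPy operations rather than explicit loops over training examples. As a rough sketch only (the variable names and shapes below are assumptions, not the graded notebook's interface), the forward pass of one fully connected layer can be written as:

```python
import numpy as np

def dense_forward(A_prev, W, b, activation="relu"):
    """Vectorized forward pass for one fully connected layer.

    A_prev: activations from the previous layer, shape (n_prev, m) for m examples
    W:      weight matrix, shape (n_curr, n_prev)
    b:      bias vector, shape (n_curr, 1)
    """
    Z = W @ A_prev + b                      # one matrix product handles all m examples
    if activation == "relu":
        A = np.maximum(0, Z)
    else:                                   # sigmoid, e.g. for the output layer
        A = 1 / (1 + np.exp(-Z))
    return A, Z                             # Z is cached for backpropagation

# Toy usage: 3 input features, 4 hidden units, 5 examples
rng = np.random.default_rng(0)
A0 = rng.standard_normal((3, 5))
W1 = rng.standard_normal((4, 3)) * 0.01
b1 = np.zeros((4, 1))
A1, Z1 = dense_forward(A0, W1, b1)
print(A1.shape)  # (4, 5)
```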
In the second course of the Deep Learning Specialization, you will open the deep learning black box to understand the processes that drive performance and generate good results systematically.
By the end, you will know best practices for setting up train/dev/test sets and analyzing bias and variance when building deep learning applications; be able to use standard neural network techniques such as initialization, L2 and dropout regularization, hyperparameter tuning, batch normalization, and gradient checking; implement and apply a variety of optimization algorithms, such as mini-batch gradient descent, Momentum, RMSprop, and Adam, and check for their convergence; and implement a neural network in TensorFlow.
- W1A1: Initialization
- W1A2: Regularization
- W1A3: Gradient Checking
- W2A1: Optimization Methods
- W3A1: TensorFlow Tutorial
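As a hedged sketch of the optimizers covered in the optimization assignment (the notebook's own function signatures differ), one Adam update on a single parameter array looks roughly like this; Momentum corresponds to the first-moment term alone and RMSprop to the second-moment term alone:

```python
import numpy as np

def adam_update(param, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam step for a single parameter array (illustrative, not the graded API).

    m, v: running first/second moment estimates, same shape as param
    t:    1-based iteration counter used for bias correction
    """
    m = beta1 * m + (1 - beta1) * grad           # first moment (Momentum)
    v = beta2 * v + (1 - beta2) * grad ** 2      # second moment (RMSprop)
    m_hat = m / (1 - beta1 ** t)                 # bias-corrected estimates
    v_hat = v / (1 - beta2 ** t)
    param = param - lr * m_hat / (np.sqrt(v_hat) + eps)
    return param, m, v

# Toy usage: one update step on a 2x2 weight matrix
W = np.ones((2, 2))
dW = np.full((2, 2), 0.1)
m, v = np.zeros_like(W), np.zeros_like(W)
W, m, v = adam_update(W, dW, m, v, t=1)
```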
In the third course of the Deep Learning Specialization, you will learn how to build a successful machine learning project and get to practice decision-making as a machine learning project leader.
By the end, you will be able to diagnose errors in a machine learning system; prioritize strategies for reducing errors; understand complex ML settings, such as mismatched training/test sets and comparing to or surpassing human-level performance; and apply end-to-end learning, transfer learning, and multi-task learning.
This is also a standalone course for learners who have basic machine learning knowledge. It draws on Andrew Ng’s experience building and shipping many deep learning products. If you aspire to become a technical leader who can set the direction for an AI team, this course provides the "industry experience" that you might otherwise get only after years of ML work experience. This course has no programming assignments.
In the fourth course of the Deep Learning Specialization, you will understand how computer vision has evolved and become familiar with its exciting applications such as autonomous driving, face recognition, reading radiology images, and more.
By the end, you will be able to build a convolutional neural network, including recent variations such as residual networks; apply convolutional networks to visual detection and recognition tasks; and use neural style transfer to generate art and apply these algorithms to a variety of image, video, and other 2D or 3D data.
- W1A1: Convolutional Model: step by step
- W1A2: Convolutional Neural Networks: Application
- W2A1: Residual Networks
- W2A2: Transfer Learning with MobileNetV2
- W3A1: Object detection with YOLO
- W3A2: Image Segmentation with U-Net
- W4A1: Face Recognition
- W4A2: Deep Learning & Art: Neural Style Transfer
In the fifth course of the Deep Learning Specialization, you will become familiar with sequence models and their exciting applications such as speech recognition, music synthesis, chatbots, machine translation, natural language processing (NLP), and more.
By the end, you will be able to build and train Recurrent Neural Networks (RNNs) and commonly used variants such as GRUs and LSTMs; apply RNNs to character-level language modeling; gain experience with natural language processing and word embeddings; and use HuggingFace tokenizers and transformer models to solve NLP tasks such as named entity recognition (NER) and question answering.
- W1A1: Building a Recurrent Neural Network - Step by Step
- W1A2: Character-level language model
- W1A3: Jazz improvisation with LSTM
- W2A1: Word Vector Representation and Debiasing
- W2A2: Emojify!
- W3A1: Neural Machine Translation with Attention
- W3A2: Trigger Word Detection
- W4A1: Transformer Network
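The first assignment builds an RNN forward pass from scratch in NumPy. A minimal sketch of one vanilla RNN cell step, with toy dimensions assumed here rather than the notebook's, might look like:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max(axis=0, keepdims=True))       # numerically stable softmax per column
    return e / e.sum(axis=0, keepdims=True)

def rnn_cell_forward(xt, a_prev, Wax, Waa, Wya, ba, by):
    """One time step of a vanilla RNN (illustrative shapes, not the graded code).

    xt:     input at time t, shape (n_x, m)
    a_prev: previous hidden state, shape (n_a, m)
    """
    a_next = np.tanh(Waa @ a_prev + Wax @ xt + ba)      # new hidden state
    yt = softmax(Wya @ a_next + by)                     # output distribution per example
    return a_next, yt

# Toy dimensions: n_x=3 input features, n_a=5 hidden units, n_y=2 classes, m=4 examples
rng = np.random.default_rng(1)
n_x, n_a, n_y, m = 3, 5, 2, 4
xt, a_prev = rng.standard_normal((n_x, m)), rng.standard_normal((n_a, m))
Wax, Waa, Wya = (rng.standard_normal((n_a, n_x)),
                 rng.standard_normal((n_a, n_a)),
                 rng.standard_normal((n_y, n_a)))
ba, by = np.zeros((n_a, 1)), np.zeros((n_y, 1))
a_next, yt = rnn_cell_forward(xt, a_prev, Wax, Waa, Wya, ba, by)
print(a_next.shape, yt.shape)  # (5, 4) (2, 4)
```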