This repository holds materials and notes from the DeepLearning.AI TensorFlow Developer Professional Certificate on Coursera.
Notes
- Week 1: What is AI?
- Week 2: Building AI Projects
- Week 3: Building AI in Your Company
- Week 4: AI & Society
Course 1: Introduction to TensorFlow for Artificial Intelligence, Machine Learning, and Deep Learning
Learning Objectives
- Monitor the accuracy of the housing price predictions
- Analyze housing price predictions that come from a single layer neural network
- Use TensorFlow to build a single layer neural network for fitting linear models
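The objectives above come down to a few lines of Keras. A minimal sketch, using the course's toy bedrooms-to-price relationship (price = 0.5 * bedrooms + 0.5, in hundreds of thousands):

```python
import numpy as np
import tensorflow as tf

# Toy housing data: price (in hundreds of thousands) = 0.5 * bedrooms + 0.5.
xs = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0], dtype=float)
ys = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5], dtype=float)

# A single Dense unit is all it takes to fit a linear relationship.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(1,)),
    tf.keras.layers.Dense(units=1),
])
model.compile(optimizer="sgd", loss="mean_squared_error")
model.fit(xs, ys, epochs=500, verbose=0)

# A 7-bedroom house should come out close to 4.0 (i.e. $400k).
prediction = model.predict(np.array([[7.0]]), verbose=0)
```

Monitoring the loss printed during `fit` (or the returned `History` object) is how you track how well the fit is converging.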
Notes
The notes for the content covered in the first week can be accessed here: first week notes
Jupyter Notebooks
Learning Objectives
- Use callback functions for tracking model loss and accuracy during training
- Make predictions on how the layer size affects network predictions and training speed
- Implement pixel value normalization to speed up network training
- Build a multilayer neural network for classifying the Fashion MNIST image dataset
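A sketch of how these pieces fit together. Random arrays stand in for the Fashion MNIST images (28x28 grayscale, 10 classes) so the snippet runs without downloading the dataset; the callback and normalization are the techniques the week covers.

```python
import numpy as np
import tensorflow as tf

# Random stand-in data with Fashion MNIST shapes (28x28 grayscale, 10 classes).
x_train = np.random.randint(0, 256, size=(128, 28, 28)).astype("float32")
y_train = np.random.randint(0, 10, size=(128,))

# Pixel normalization: scaling 0-255 values down to 0-1 speeds up training.
x_train = x_train / 255.0

# A callback that stops training once accuracy crosses a threshold.
class StopAtAccuracy(tf.keras.callbacks.Callback):
    def on_epoch_end(self, epoch, logs=None):
        if logs and logs.get("accuracy", 0) > 0.95:
            self.model.stop_training = True

# A multilayer (fully connected) classifier.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28)),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
history = model.fit(x_train, y_train, epochs=2, verbose=0,
                    callbacks=[StopAtAccuracy()])
```

Changing the `128` in the hidden `Dense` layer is how you experiment with the effect of layer size on accuracy and training speed.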
Notes
The notes for the content covered in the second week can be accessed here: second week notes
Responsible AI Practices
The development of AI is creating new opportunities to improve the lives of people around the world, from business to healthcare to education. It is also raising new questions about the best way to build fairness, interpretability, privacy, and security into these systems.
Jupyter Notebooks
- Get hands-on with computer vision
- Using Callbacks to Control Training
- Assignment 2: Implementing Callbacks in TensorFlow using the MNIST Dataset
Learning Objectives
- Use callback functions to interrupt training after meeting a threshold accuracy
- Test the effect that adding convolution and MaxPooling to the Fashion MNIST classifier has on classification accuracy
- Explain and visualize how convolution and MaxPooling aid in image classification tasks
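The shape arithmetic is the key intuition: a 3x3 convolution trims one pixel off each border, and each 2x2 MaxPooling halves the spatial dimensions. A sketch of the week's model with the resulting shapes annotated:

```python
import tensorflow as tf

# Convolutions extract features; each 2x2 MaxPooling then halves the
# spatial dimensions, compressing the image while keeping those features.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(28, 28, 1)),
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu"),  # -> 26x26x32
    tf.keras.layers.MaxPooling2D(2, 2),                     # -> 13x13x32
    tf.keras.layers.Conv2D(32, (3, 3), activation="relu"),  # -> 11x11x32
    tf.keras.layers.MaxPooling2D(2, 2),                     # -> 5x5x32
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# A forward pass on a dummy Fashion-MNIST-shaped image.
out = model(tf.zeros((1, 28, 28, 1)))
```

`model.summary()` prints the same shape progression, which is a quick way to visualize what pooling does.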
Notes
The notes for the content covered in the third week can be accessed here: third week notes
Jupyter Notebooks
- Improving Computer Vision Accuracy using Convolutions
- Exploring Convolutions
- Assignment 3: Improve MNIST with Convolutions
Learning Objectives
- Reflect on the possible shortcomings of your binary classification model implementation
- Execute image preprocessing with the Keras ImageDataGenerator functionality
- Carry out real life image classification by leveraging a multilayer neural network for binary classification
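A sketch of the ImageDataGenerator preprocessing step. The directory path in the comment is illustrative; the `.flow` call on synthetic arrays keeps the snippet self-contained.

```python
import numpy as np
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# ImageDataGenerator rescales pixel values on the fly. In the course the
# images are read from a labeled directory tree, roughly:
#   train_gen = datagen.flow_from_directory(
#       "horse-or-human/",           # illustrative path
#       target_size=(300, 300), batch_size=32, class_mode="binary")
datagen = ImageDataGenerator(rescale=1.0 / 255)

# .flow on synthetic arrays keeps this snippet self-contained.
images = np.random.randint(0, 256, size=(8, 64, 64, 3)).astype("float32")
labels = np.array([0, 1] * 4, dtype="float32")
batch_x, batch_y = next(datagen.flow(images, labels, batch_size=8, shuffle=False))
```

For binary classification the model ends in a single sigmoid unit and is compiled with `binary_crossentropy`.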
Notes
The notes for the content covered in the fourth week can be accessed here: fourth week notes
Jupyter Notebooks
- Training with ImageDataGenerator
- ImageDataGenerator with a Validation Set
- Effect of Compacted Images in Training
- Assignment 4: Handling Complex Images
In this course we'll go deeper into using ConvNets with real-world data, and learn about techniques that you can use to improve your ConvNet performance, particularly when doing image classification!
Learning Objectives
- Gain understanding about Keras’ utilities for pre-processing image data, in particular the ImageDataGenerator class
- Develop helper functions to move files around the filesystem so that they can be fed to the ImageDataGenerator
- Learn how to plot training and validation accuracies to evaluate model performance
- Build a classifier using convolutional neural networks for performing cats vs dogs classification
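The accuracy-plotting pattern used throughout the course can be sketched as below. The history values here are made up purely to illustrate the plotting code; in practice you would read them from the `History` object returned by `model.fit()`.

```python
import matplotlib
matplotlib.use("Agg")  # headless backend so the script runs anywhere
import matplotlib.pyplot as plt

# Stand-in for history.history as returned by model.fit();
# these numbers are fabricated for illustration only.
history = {
    "accuracy":     [0.60, 0.72, 0.80, 0.85],
    "val_accuracy": [0.58, 0.68, 0.73, 0.74],
}
epochs = range(1, len(history["accuracy"]) + 1)

plt.plot(epochs, history["accuracy"], "r", label="Training accuracy")
plt.plot(epochs, history["val_accuracy"], "b", label="Validation accuracy")
plt.title("Training and validation accuracy")
plt.xlabel("Epoch")
plt.ylabel("Accuracy")
plt.legend()
plt.savefig("accuracy.png")
```

A widening gap between the two curves is the visual signature of overfitting.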
Data
Dogs vs Cats Dataset from Kaggle
Notes
Jupyter Notebooks
- Using more sophisticated images with Convolutional Neural Networks
- Assignment 1: Using CNNs with the Cats vs Dogs Dataset
Overfitting is simply the concept of being over-specialized in training -- namely that your model is very good at classifying what it was trained on, but not so good at classifying things it hasn't seen. To generalize your model more effectively, you will of course need a greater breadth of samples to train it on. That's not always possible, but a nice potential shortcut is image augmentation, where you tweak the training set to potentially increase the diversity of subjects it covers.
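The augmentation described above is configured through ImageDataGenerator parameters; each transform is applied randomly per image at training time, so the model rarely sees the exact same picture twice. A sketch with the parameters covered in the course:

```python
import numpy as np
from tensorflow.keras.preprocessing.image import ImageDataGenerator

# Random augmentation applied on the fly during training.
train_datagen = ImageDataGenerator(
    rescale=1.0 / 255,
    rotation_range=40,       # rotate up to 40 degrees
    width_shift_range=0.2,   # shift horizontally up to 20% of width
    height_shift_range=0.2,  # shift vertically up to 20% of height
    shear_range=0.2,
    zoom_range=0.2,
    horizontal_flip=True,
    fill_mode="nearest",     # fill pixels exposed by a transform
)

# Run the pipeline on synthetic images just to show it works end to end.
images = np.random.randint(0, 256, size=(4, 64, 64, 3)).astype("float32")
batch = next(train_datagen.flow(images, batch_size=4, shuffle=False))
```

Note that augmentation is applied only to the training generator; the validation generator should use `rescale` alone.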
Learning Objectives
- Recognize the impact of adding image augmentation to the training process, particularly on training time
- Demonstrate overfitting, or the lack thereof, by plotting training and validation accuracies
- Familiarize with the ImageDataGenerator parameters used for carrying out image augmentation
- Learn how to mitigate overfitting by using data augmentation techniques
Notes
- To learn more about augmentation
- Data augmentation can solve overfitting on the Cats vs Dogs dataset. But when applied to the Horses vs Humans dataset, it doesn't perform as well. The reason could be that the augmented training images do not reflect the features present in the validation set.
Data
Jupyter Notebooks
- Data Augmentation
- Data Augmentation on the Horses or Humans Dataset
- Tackle Overfitting with Data Augmentation
Learning Objectives
- Master the Keras layer type known as Dropout to avoid overfitting
- Achieve transfer learning in code using the Keras API
- Code a model that implements Keras’ functional API instead of the commonly used Sequential model
- Learn how to freeze layers from an existing model to successfully implement transfer learning
- Explore the concept of transfer learning to use the convolutions learned by a different model from a larger dataset
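The objectives above can be sketched in one model definition: freeze a pretrained convolutional base, cut it at an intermediate layer, and attach a new classifier head (with Dropout) via the functional API. The course loads InceptionV3 with pretrained weights from a file; `weights=None` here only avoids that download, so this is a structural sketch rather than a working transfer-learning run.

```python
import tensorflow as tf

# Convolutional base; the course uses pretrained InceptionV3 weights,
# weights=None here just keeps the sketch self-contained.
base = tf.keras.applications.InceptionV3(
    input_shape=(150, 150, 3), include_top=False, weights=None)

for layer in base.layers:
    layer.trainable = False  # freeze the base so its convolutions are reused

# Cut the network at an intermediate layer and attach a new head
# using the functional API.
last_output = base.get_layer("mixed7").output
x = tf.keras.layers.Flatten()(last_output)
x = tf.keras.layers.Dense(1024, activation="relu")(x)
x = tf.keras.layers.Dropout(0.2)(x)           # drop 20% of units
x = tf.keras.layers.Dense(1, activation="sigmoid")(x)

model = tf.keras.Model(base.input, x)
model.compile(optimizer=tf.keras.optimizers.RMSprop(learning_rate=1e-4),
              loss="binary_crossentropy", metrics=["accuracy"])
```

Only the new head's weights are updated during training, which is what makes transfer learning cheap.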
Notes
- Understanding Dropout
- Transfer Learning and Dropout
- Inception architecture for Computer Vision
- Pretrained weights
Jupyter Notebooks
Learning Objectives
- Build a multiclass classifier for the Sign Language MNIST dataset
- Learn how to properly set up the ImageDataGenerator parameters and the model definition functions for multiclass classification
- Understand the difference between using actual image files vs images encoded in other formats and how this changes the methods available when using ImageDataGenerator
- Code a helper function to parse a raw CSV file which contains the information of the pixel values for the images used
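A sketch of such a helper for a Sign-Language-MNIST-style CSV, where each row is a label followed by 784 pixel values (a flattened 28x28 image). The function name and the two fake rows are hypothetical, for illustration only.

```python
import csv
import io
import numpy as np

def parse_data_from_csv(file_obj):
    """Parse rows of (label, pixel1..pixel784) into image and label arrays."""
    labels, images = [], []
    reader = csv.reader(file_obj)
    next(reader)  # skip the header row
    for row in reader:
        labels.append(int(row[0]))
        images.append(np.array(row[1:], dtype=np.float64).reshape(28, 28))
    return np.array(images), np.array(labels)

# Two fake rows are enough to exercise the parser.
header = "label," + ",".join(f"pixel{i}" for i in range(1, 785))
make_row = lambda lbl: f"{lbl}," + ",".join(["0"] * 784)
csv_text = "\n".join([header, make_row(3), make_row(7)])
images, labels = parse_data_from_csv(io.StringIO(csv_text))
```

Because the images arrive as arrays rather than files, they are fed to `ImageDataGenerator.flow` instead of `flow_from_directory`.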
Data
Jupyter Notebooks
In this course we spent a lot of time on Convolutional Neural Networks:
- Exploring how to use them with large datasets
- Taking advantage of augmentation, dropout, regularization, and transfer learning
- Looking at the coding considerations between binary or multi-class classification
Learning Objectives
The first step in understanding sentiment in text, particularly when training a neural network to do so, is the tokenization of that text: the process of converting the text into numeric values, with a number representing a word or a character. We will learn about the Tokenizer and pad_sequences APIs in TensorFlow and how they can be used to prepare and encode text and sentences to get them ready for training neural networks!
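The tokenize-then-pad workflow above can be sketched in a few lines; the sentences are toy examples in the style of the course:

```python
from tensorflow.keras.preprocessing.text import Tokenizer
from tensorflow.keras.preprocessing.sequence import pad_sequences

sentences = [
    "I love my dog",
    "I love my cat",
    "Do you think my dog is amazing?",
]

# oov_token stands in for words not seen when the tokenizer was fitted.
tokenizer = Tokenizer(num_words=100, oov_token="<OOV>")
tokenizer.fit_on_texts(sentences)
word_index = tokenizer.word_index

# Convert each sentence to a list of word indices, then pad so every
# sequence has the same length for training.
sequences = tokenizer.texts_to_sequences(sentences)
padded = pad_sequences(sequences, padding="post", maxlen=7)
```

`padding="post"` appends zeros at the end; the default pads at the front instead.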
Data
- News headlines dataset for sarcasm detection: Kaggle
- Download the Dataset
- BBC News Classification Dataset
Jupyter Notebooks