- Building basic functions with numpy
- Sigmoid function, np.exp()
- Sigmoid gradient
- Reshaping arrays
- Normalizing rows
- Broadcasting and the softmax function
- Vectorization
- Implement the L1 and L2 loss functions
- Packages
- Overview of the Problem set
- General Architecture of the learning algorithm
- Building the parts of our algorithm
- Helper functions
- Initializing parameters
- Forward and Backward propagation
- Merge all functions into a model
- Further analysis
- Test with your own image
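The numpy warm-up items above (sigmoid, the sigmoid gradient, normalizing rows) can be sketched as follows. This is a minimal illustration; the function names here are illustrative, not necessarily the ones used in the assignment:

```python
import numpy as np

def sigmoid(x):
    # elementwise sigmoid: 1 / (1 + e^{-x}), works on scalars and arrays
    return 1 / (1 + np.exp(-x))

def sigmoid_grad(x):
    # derivative of the sigmoid: sigma(x) * (1 - sigma(x))
    s = sigmoid(x)
    return s * (1 - s)

def normalize_rows(x):
    # divide each row of x by its L2 norm so every row has unit length
    norms = np.linalg.norm(x, axis=1, keepdims=True)
    return x / norms
```

`keepdims=True` keeps the norm as a column vector so broadcasting divides each row correctly.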
exp_1_3_Planar data classification with a hidden layer
- Packages
- Dataset
- Simple Logistic Regression
- Neural Network model
- Defining the neural network structure
- Initialize the model's parameters
- The Loop
- Integrate parts 4.1, 4.2 and 4.3 in nn_model()
- Predictions
- Tuning hidden layer size
- Performance on other datasets
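The core of the hidden-layer model above (initialize parameters, then a tanh hidden layer feeding a sigmoid output) can be sketched like this; shapes and the 0.01 scaling follow the usual convention for this exercise, but the function names are illustrative:

```python
import numpy as np

def init_params(n_x, n_h, n_y, seed=2):
    # small random weights break symmetry; biases start at zero
    rng = np.random.default_rng(seed)
    W1 = rng.standard_normal((n_h, n_x)) * 0.01
    b1 = np.zeros((n_h, 1))
    W2 = rng.standard_normal((n_y, n_h)) * 0.01
    b2 = np.zeros((n_y, 1))
    return W1, b1, W2, b2

def forward(X, W1, b1, W2, b2):
    # hidden layer uses tanh, output layer uses sigmoid (binary labels)
    Z1 = W1 @ X + b1
    A1 = np.tanh(Z1)
    Z2 = W2 @ A1 + b2
    A2 = 1 / (1 + np.exp(-Z2))
    return A1, A2
```

With `X` of shape `(n_x, m)`, the output `A2` has shape `(n_y, m)`: one probability per example.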
- Packages
- Outline of the Assignment
- Initialization
- 2-layer Neural Network
- L-layer Neural Network
- Forward propagation module
- Linear Forward
- Linear-Activation Forward
- Cost function
- Backward propagation module
- Linear backward
- Linear-Activation backward
- L-Model Backward
- Update Parameters
- Conclusion
- Packages
- Dataset
- Architecture of your model
- 2-layer neural network
- L-layer deep neural network
- General methodology
- Two-layer neural network
- L-layer Neural Network
- Results Analysis
- Test with your own image
- Neural Network model
- Zero initialization
- Random initialization
- He initialization
- Conclusions
- Non-regularized model
- L2 Regularization
- Dropout
- Forward propagation with dropout
- Backward propagation with dropout
- Conclusions
- How does gradient checking work?
- 1-dimensional gradient checking
- N-dimensional gradient checking
- Gradient Descent
- Mini-batch gradient descent
- Momentum
- Adam
- Model with different optimization algorithms
- Mini-batch gradient descent
- Mini-batch gradient descent with momentum
- Mini-batch gradient descent with Adam
- Summary
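Of the optimizers listed above, Adam combines momentum (a moving average of gradients) with a moving average of squared gradients, plus bias correction for both. A sketch of one Adam update for a single parameter; the function name and default hyperparameters (standard values from the Adam paper) are illustrative:

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.01, beta1=0.9, beta2=0.999, eps=1e-8):
    # m: moving average of gradients; v: moving average of squared gradients
    m = beta1 * m + (1 - beta1) * grad
    v = beta2 * v + (1 - beta2) * grad**2
    # bias correction compensates for m and v starting at zero (t is 1-based)
    m_hat = m / (1 - beta1**t)
    v_hat = v / (1 - beta2**t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v
```

In a full model the same update is applied to every `W` and `b`, each with its own `m` and `v` state.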
Exploring the TensorFlow Library
- Linear function
- Computing the sigmoid
- Computing the Cost
- Using One Hot encodings
- Initialize with zeros and ones
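The one-hot encoding item above can be done in TensorFlow with `tf.one_hot`; the underlying operation is simple enough to sketch in numpy (function name illustrative):

```python
import numpy as np

def one_hot(labels, C):
    # labels: 1-D array of integer class indices; C: number of classes
    # returns a (C, m) matrix with a single 1 per column
    out = np.zeros((C, labels.size))
    out[labels, np.arange(labels.size)] = 1
    return out
```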
Building your first neural network in TensorFlow
- Problem statement: SIGNS Dataset
- Create placeholders
- Initializing the parameters
- Forward propagation in TensorFlow
- Compute cost
- Backward propagation & parameter updates
- Building the model
- Test with your own image
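The "Compute cost" step for the SIGNS dataset uses softmax cross-entropy over the output logits (in TensorFlow 1.x, via `tf.nn.softmax_cross_entropy_with_logits`). A numpy sketch of the same computation, written in a numerically stable form; the function name is illustrative:

```python
import numpy as np

def softmax_cross_entropy_cost(Z, Y):
    # Z: logits of shape (classes, m); Y: one-hot labels of the same shape
    # subtracting the column max before exponentiating avoids overflow
    Z_shift = Z - Z.max(axis=0, keepdims=True)
    log_probs = Z_shift - np.log(np.exp(Z_shift).sum(axis=0, keepdims=True))
    # average negative log-likelihood of the correct class over the batch
    return -np.mean(np.sum(Y * log_probs, axis=0))
```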