avoid URL redirect
mauvaisetroupe committed Oct 1, 2023
1 parent 2273e12 commit 73bc706
Showing 14 changed files with 51 additions and 51 deletions.
@@ -30,7 +30,7 @@ Explanation of the notion of
- Gradient descent
- Logistic regression

Concept explained in the [Coursera Machine Learning](/deeplearning/machine-learning-specialization/c1-supervised-ml/week1) course.
Concept explained in the [Coursera Machine Learning](/deeplearning/machine-learning-specialization/c1-supervised-ml/week1/) course.

An interesting analogy for overfitting: the more a student drills the practice exercises, the better they get at those exercises. But what happens with the exam exercises? The solution is therefore not to train too heavily on the practice exercises and/or to increase the number of exercises in the training phase.
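
To make the analogy concrete, here is a hypothetical numpy sketch (not from the course material): a high-degree polynomial memorizes a handful of noisy training points, while a lower-degree fit generalizes better to held-out points.

```python
import numpy as np

# Hypothetical illustration: 10 noisy training points drawn from y = x^2.
# A degree-2 fit matches the underlying curve, while a degree-9 fit drives
# the training error to ~0 yet typically does worse on held-out points,
# like the student who memorized the practice exercises.
rng = np.random.default_rng(0)
x_train = np.linspace(0, 1, 10)
y_train = x_train**2 + rng.normal(scale=0.2, size=x_train.size)
x_test = np.linspace(0.05, 0.95, 50)   # unseen points
y_test = x_test**2                     # noise-free ground truth

for degree in (2, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_mse = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_mse = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train MSE {train_mse:.4f}, test MSE {test_mse:.4f}")
```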

38 changes: 19 additions & 19 deletions content/en/deeplearning/deep-learning-specialization/_index.md
@@ -20,7 +20,7 @@ In this Specialization, you will build and train neural network architectures su
AI is transforming many industries. The Deep Learning Specialization provides a pathway for you to take the definitive step in the world of AI by helping you gain the knowledge and skills to level up your career. Along the way, you will also get career advice from deep learning experts from industry and academia.


## [Course 1 - Neural Networks and Deep Learning](./c1-neural-networks-and-deep-learning)
## [Course 1 - Neural Networks and Deep Learning](./c1-neural-networks-and-deep-learning/)

https://www.coursera.org/learn/neural-networks-deep-learning

@@ -35,15 +35,15 @@ SKILLS YOU WILL GAIN
- Python Programming
- Neural Network Architecture

#### [Week 1 - Introduction to deep learning](./c1-neural-networks-and-deep-learning/week1)

#### [Week 1 - Introduction to deep learning](./c1-neural-networks-and-deep-learning/week1/)

Analyze the major trends driving the rise of deep learning, and give examples of where and how it is applied today.

#### [Week 2 - Neural Networks Basics](./c1-neural-networks-and-deep-learning/week2)

#### [Week 2 - Neural Networks Basics](./c1-neural-networks-and-deep-learning/week2/)

Set up a machine learning problem with a neural network mindset and use vectorization to speed up your models.
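
As a hedged illustration of the vectorization point (the vector size and timings are arbitrary, not taken from the course notebooks), compare an explicit Python loop with a single numpy call:

```python
import time
import numpy as np

# Dot product of two large vectors: element-by-element loop vs. one vectorized call.
n = 1_000_000
a = np.random.rand(n)
b = np.random.rand(n)

t0 = time.perf_counter()
slow = sum(a[i] * b[i] for i in range(n))   # explicit loop over elements
t1 = time.perf_counter()
fast = np.dot(a, b)                         # vectorized library call
t2 = time.perf_counter()

print(f"loop: {t1 - t0:.3f}s, vectorized: {t2 - t1:.5f}s, same result: {np.isclose(slow, fast)}")
```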

#### [Week 3 - Shallow Neural Networks](./c1-neural-networks-and-deep-learning/week3)
#### [Week 3 - Shallow Neural Networks](./c1-neural-networks-and-deep-learning/week3/)

Build a neural network with one hidden layer, using forward propagation and backpropagation.
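
A minimal numpy sketch of the forward pass for such a network (the shapes, activations, and initialization are illustrative assumptions, not the assignment's exact code); backpropagation would then compute gradients of the loss with respect to W1, b1, W2, and b2:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X, W1, b1, W2, b2):
    Z1 = W1 @ X + b1        # hidden layer pre-activation
    A1 = np.tanh(Z1)        # hidden layer activation
    Z2 = W2 @ A1 + b2       # output pre-activation
    A2 = sigmoid(Z2)        # predicted probabilities
    return A2

# Toy example: 2 input features, 4 hidden units, 3 training examples.
rng = np.random.default_rng(1)
X = rng.normal(size=(2, 3))
W1, b1 = 0.01 * rng.normal(size=(4, 2)), np.zeros((4, 1))
W2, b2 = 0.01 * rng.normal(size=(1, 4)), np.zeros((1, 1))
print(forward(X, W1, b1, W2, b2))
```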

@@ -54,7 +54,7 @@ Analyze the key computations underlying deep learning, then use them to build an



## [Course 2 - Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization](./c2-improving-deep-neural-networks)
## [Course 2 - Improving Deep Neural Networks: Hyperparameter Tuning, Regularization and Optimization](./c2-improving-deep-neural-networks/)

https://www.coursera.org/learn/deep-neural-network

@@ -70,21 +70,21 @@ SKILLS YOU WILL GAIN



#### [Week 1 - Practical Aspects of Deep Learning](./c2-improving-deep-neural-networks/week1)
#### [Week 1 - Practical Aspects of Deep Learning](./c2-improving-deep-neural-networks/week1/)

Discover and experiment with a variety of different initialization methods, apply L2 regularization and dropout to avoid model overfitting, then apply gradient checking to identify errors in a fraud detection model.
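
For orientation, here is a hedged Keras sketch of the two regularizers mentioned above (the layer sizes, the 0.01 L2 factor, and the 0.3 dropout rate are made-up values, and this is not the course's grader code):

```python
import tensorflow as tf

# A small dense classifier with L2 weight decay and dropout.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(
        64, activation="relu",
        kernel_regularizer=tf.keras.regularizers.l2(0.01)),  # L2 penalty on the weights
    tf.keras.layers.Dropout(0.3),                            # drop 30% of units during training
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```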

#### [Week 2 - Optimization Algorithms](./c2-improving-deep-neural-networks/week2)
#### [Week 2 - Optimization Algorithms](./c2-improving-deep-neural-networks/week2/)

Develop your deep learning toolbox by adding more advanced optimizations, random minibatching, and learning rate decay scheduling to speed up your models.
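
As a sketch of one of those ideas, learning rate decay can be as simple as shrinking the step size each epoch; the inverse-time formula and constants below are illustrative, not the only scheme taught:

```python
# Inverse-time learning rate decay: alpha = alpha0 / (1 + decay_rate * epoch).
def decayed_learning_rate(alpha0, decay_rate, epoch):
    return alpha0 / (1.0 + decay_rate * epoch)

for epoch in range(0, 50, 10):
    print(f"epoch {epoch:2d}: alpha = {decayed_learning_rate(0.2, 1.0, epoch):.4f}")
```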

#### [Week 3 - Hyperparameter Tuning, Batch Normalization and Programming Frameworks](./c2-improving-deep-neural-networks/week3)
#### [Week 3 - Hyperparameter Tuning, Batch Normalization and Programming Frameworks](./c2-improving-deep-neural-networks/week3/)

Explore TensorFlow, a deep learning framework that allows you to build neural networks quickly and easily, then train a neural network on a TensorFlow dataset.



## [Course 3 - Structuring Machine Learning Projects](./c3-structuring-ml-projects)

## [Course 3 - Structuring Machine Learning Projects](./c3-structuring-ml-projects/)

https://www.coursera.org/learn/machine-learning-projects

@@ -100,16 +100,16 @@ SKILLS YOU WILL GAIN
- Decision-Making


#### [Week 1 - ML strategy](./c3-structuring-ml-projects/week1)
#### [Week 1 - ML strategy](./c3-structuring-ml-projects/week1/)

Streamline and optimize your ML production workflow by implementing strategic guidelines for goal-setting and applying human-level performance to help define key priorities.

#### [Week 2 - ML strategy](./c3-structuring-ml-projects/week2)
#### [Week 2 - ML strategy](./c3-structuring-ml-projects/week2/)

Develop time-saving error analysis procedures to evaluate the most worthwhile options to pursue and gain intuition for how to split your data and when to use multi-task, transfer, and end-to-end deep learning.


## [Course 4 - Convolutional Neural Networks](./c4-convolutional-neural-netowrks)

## [Course 4 - Convolutional Neural Networks](./c4-convolutional-neural-netowrks/)

https://www.coursera.org/learn/convolutional-neural-networks

@@ -124,16 +124,16 @@ SKILLS YOU WILL GAIN
- Tensorflow
- Object Detection And Segmentation

#### [Week 1 - Foundations of Convolutional Neural Networks](./c4-convolutional-neural-netowrks/week1)
#### [Week 1 - Foundations of Convolutional Neural Networks](./c4-convolutional-neural-netowrks/week1/)

Implement the foundational layers of CNNs (pooling, convolutions) and stack them properly in a deep network to solve multi-class image classification problems.
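
A hedged Keras sketch of such a stack (the input size, filter counts, and 10-class output are assumptions for illustration, not the assignment's architecture):

```python
import tensorflow as tf

# Conv -> pool blocks followed by a softmax classifier for multi-class images.
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),
    tf.keras.layers.Conv2D(8, kernel_size=3, activation="relu"),
    tf.keras.layers.MaxPooling2D(pool_size=2),
    tf.keras.layers.Conv2D(16, kernel_size=3, activation="relu"),
    tf.keras.layers.MaxPooling2D(pool_size=2),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```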


#### [Week 2 - Deep Convolutional Models: Case Studies](./c4-convolutional-neural-netowrks/week2)
#### [Week 2 - Deep Convolutional Models: Case Studies](./c4-convolutional-neural-netowrks/week2/)

Discover some powerful practical tricks and methods used in deep CNNs, straight from the research papers, then apply transfer learning to your own deep CNN.
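
One common transfer-learning recipe, sketched here with MobileNetV2 as an assumed pretrained base (the course makes its own choice of architecture and data), is to freeze the convolutional base and train only a new classification head:

```python
import tensorflow as tf

# Reuse ImageNet features; only the new output layer is trained.
base = tf.keras.applications.MobileNetV2(include_top=False, weights="imagenet",
                                         input_shape=(160, 160, 3), pooling="avg")
base.trainable = False  # keep the pretrained features fixed

model = tf.keras.Sequential([
    base,
    tf.keras.layers.Dense(1, activation="sigmoid"),  # binary task assumed for illustration
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```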

#### [Week 3 - Object Detection](./c4-convolutional-neural-netowrks/week3)
#### [Week 3 - Object Detection](./c4-convolutional-neural-netowrks/week3/)

Apply your new knowledge of CNNs to one of the hottest (and most challenging!) fields in computer vision: object detection.

@@ -142,7 +142,7 @@ Apply your new knowledge of CNNs to one of the hottest (and most challenging!) f

Explore how CNNs can be applied to multiple fields, including art generation and face recognition, then implement your own algorithm to generate art and recognize faces!

## [Course 5 - Sequence Models](./c5-recurrent-neural-networks)

## [Course 5 - Sequence Models](./c5-recurrent-neural-networks/)

https://www.coursera.org/learn/nlp-sequence-models

@@ -157,17 +157,17 @@ SKILLS YOU WILL GAIN
- Recurrent Neural Network
- Attention Models

#### [Week 1 - Recurrent Neural Networks](./c5-recurrent-neural-networks/week1)
#### [Week 1 - Recurrent Neural Networks](./c5-recurrent-neural-networks/week1/)

Discover recurrent neural networks, a type of model that performs extremely well on temporal data, and several of its variants, including LSTMs, GRUs, and Bidirectional RNNs.
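
As a hedged sketch (the vocabulary size, sequence encoding, and layer sizes are invented for illustration), a bidirectional LSTM classifier in Keras looks like this:

```python
import tensorflow as tf

# Embedding -> bidirectional LSTM -> sigmoid output for binary sequence labels.
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(input_dim=5000, output_dim=32),   # integer tokens -> vectors
    tf.keras.layers.Bidirectional(tf.keras.layers.LSTM(64)),    # reads the sequence both ways
    tf.keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
```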


#### [Week 2 - Natural Language Processing & Word Embeddings](./c5-recurrent-neural-networks/week2)
#### [Week 2 - Natural Language Processing & Word Embeddings](./c5-recurrent-neural-networks/week2/)

Natural language processing with deep learning is a powerful combination. Using word vector representations and embedding layers, train recurrent neural networks with outstanding performance across a wide variety of applications, including sentiment analysis, named entity recognition and neural machine translation.


#### [Week 3 - Sequence Models & Attention Mechanism](./c5-recurrent-neural-networks/week3)
#### [Week 3 - Sequence Models & Attention Mechanism](./c5-recurrent-neural-networks/week3/)

Augment your sequence models using an attention mechanism, an algorithm that helps your model decide where to focus its attention given a sequence of inputs. Then, explore speech recognition and how to deal with audio data.
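
One common formulation of the idea, shown as a hedged numpy sketch (scaled dot-product attention; the course also covers other attention variants for sequence-to-sequence models):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Weights = softmax(Q K^T / sqrt(d_k)); output = weights @ V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    scores -= scores.max(axis=-1, keepdims=True)   # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V, weights

rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))    # 3 query positions, dimension 4
K = rng.normal(size=(5, 4))    # 5 key/value positions
V = rng.normal(size=(5, 4))
context, attn = scaled_dot_product_attention(Q, K, V)
print(context.shape, attn.sum(axis=-1))   # (3, 4), attention rows sum to 1
```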

@@ -22,7 +22,7 @@ Learning Objectives
### Why look at case studies?


[Last week](../week1) we learned about the basic building blocks of a convnet, such as convolutional layers, pooling layers, and fully connected layers.

[Last week](../week1/) we learned about the basic building blocks of a convnet, such as convolutional layers, pooling layers, and fully connected layers.

In the past few years, a lot of computer vision research has been done to put together these basic building blocks to form effective convolutional neural networks.

22 changes: 11 additions & 11 deletions content/en/deeplearning/machine-learning-specialization/_index.md
@@ -43,7 +43,7 @@ In the first course of the Machine Learning Specialization, you will:
- Build machine learning models in Python using popular machine learning libraries NumPy and scikit-learn.
- Build and train supervised machine learning models for prediction and binary classification tasks, including linear regression and logistic regression

#### [Week 1: Introduction to Machine Learning](./c1-supervised-ml/week1)
#### [Week 1: Introduction to Machine Learning](./c1-supervised-ml/week1/)

Welcome to the Machine Learning Specialization! You're joining millions of others who have taken either this or the original course, which led to the founding of Coursera, and has helped millions of other learners, like you, take a look at the exciting world of machine learning!

@@ -57,7 +57,7 @@ Learning Objectives
- Implement gradient descent (see the sketch after this list)
- Optimize a regression model using gradient descent
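
A minimal sketch of that objective (the data, learning rate, and iteration count below are made up): fit y = w·x + b by repeatedly stepping opposite the gradient of the squared-error cost.

```python
import numpy as np

# Toy data generated from y = 200*x + 100.
x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([300.0, 500.0, 700.0, 900.0])

w, b, alpha = 0.0, 0.0, 0.01
for _ in range(10_000):
    y_hat = w * x + b
    dw = np.mean((y_hat - y) * x)   # dJ/dw for J = (1/2m) * sum((y_hat - y)^2)
    db = np.mean(y_hat - y)         # dJ/db
    w -= alpha * dw
    b -= alpha * db

print(f"w = {w:.2f}, b = {b:.2f}")  # approaches w = 200, b = 100
```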

#### [Week 2: Regression with multiple input variables](./c1-supervised-ml/week2)
#### [Week 2: Regression with multiple input variables](./c1-supervised-ml/week2/)

This week, you'll extend linear regression to handle multiple input features. You'll also learn some methods for improving your model's training and performance, such as vectorization, feature scaling, feature engineering and polynomial regression. At the end of the week, you'll get to practice implementing linear regression in code.
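
For instance, z-score feature scaling (a hedged sketch with made-up house data; the other techniques follow the same spirit) puts features on comparable ranges so gradient descent converges faster:

```python
import numpy as np

# Columns: size in square feet, number of bedrooms (illustrative values).
X = np.array([[2104, 5], [1416, 3], [852, 2]], dtype=float)
mu = X.mean(axis=0)
sigma = X.std(axis=0)
X_norm = (X - mu) / sigma   # each column now has mean 0 and standard deviation 1
print(X_norm.round(2))
```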

@@ -66,7 +66,7 @@ Learning Objectives
- Use feature scaling, feature engineering, and polynomial regression to improve model training
- Implement linear regression in code

#### [Week 3: Classification](./c1-supervised-ml/week3)
#### [Week 3: Classification](./c1-supervised-ml/week3/)

This week, you'll learn the other type of supervised learning, classification. You'll learn how to predict categories using the logistic regression model. You'll learn about the problem of overfitting, and how to handle this problem with a method called regularization. You'll get to practice implementing logistic regression with regularization at the end of this week!
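
A hedged numpy sketch of the regularized logistic-regression cost discussed here (toy data and an arbitrary lambda; the bias term is conventionally left unregularized):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def regularized_cost(X, y, w, b, lambda_):
    m = X.shape[0]
    f = sigmoid(X @ w + b)                                    # predicted probabilities
    loss = -np.mean(y * np.log(f) + (1 - y) * np.log(1 - f))  # cross-entropy loss
    reg = (lambda_ / (2 * m)) * np.sum(w**2)                  # penalty on large weights
    return loss + reg

X = np.array([[0.5, 1.5], [1.0, 1.0], [1.5, 0.5], [3.0, 0.5]])
y = np.array([0, 0, 1, 1])
w, b = np.array([0.5, -0.5]), 0.0
print(round(regularized_cost(X, y, w, b, lambda_=1.0), 4))
```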

@@ -76,7 +76,7 @@ Learning Objectives
- Address overfitting using regularization, to improve model performance


### [Advanced Learning Algorithms](./c2-advanced-learning-algorithms)
### [Advanced Learning Algorithms](./c2-advanced-learning-algorithms/)

In the second course of the Machine Learning Specialization, you will:

@@ -85,7 +85,7 @@ In the second course of the Machine Learning Specialization, you will:
- Build and use decision trees and tree ensemble methods, including random forests and boosted trees


#### [Week 1 - Neural Networks](./c2-advanced-learning-algorithms/week1)
#### [Week 1 - Neural Networks](./c2-advanced-learning-algorithms/week1/)

This week, you'll learn about neural networks and how to use them for classification tasks. You'll use the TensorFlow framework to build a neural network with just a few lines of code. Then, dive deeper by learning how to code up your own neural network in Python, "from scratch". Optionally, you can learn more about how neural network computations are implemented efficiently using parallel processing (vectorization).

@@ -100,7 +100,7 @@ Learning Objectives
- Build a neural network in regular Python code (from scratch) to make predictions.
- (Optional): Learn how neural networks use parallel processing (vectorization) to make computations faster.

#### [Week 2 - Neural Networks Training](./c2-advanced-learning-algorithms/week2)
#### [Week 2 - Neural Networks Training](./c2-advanced-learning-algorithms/week2/)

This week, you'll learn how to train your model in TensorFlow, and also learn about other important activation functions (besides the sigmoid function), and where to use each type in a neural network. You'll also learn how to go beyond binary classification to multiclass classification (3 or more categories). Multiclass classification will introduce you to a new activation function and a new loss function. Optionally, you can also learn about the difference between multiclass classification and multi-label classification. You'll learn about the Adam optimizer, and why it's an improvement upon regular gradient descent for neural network training. Finally, you will get a brief introduction to other layer types besides the one you've seen thus far.
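
A hedged Keras sketch tying several of these threads together (the layer sizes and 10-class output are illustrative): the output layer emits raw logits, `from_logits=True` lets the loss apply softmax in a numerically stable way, and Adam replaces plain gradient descent.

```python
import tensorflow as tf

# ReLU hidden layers, linear output (logits), softmax handled inside the loss.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(25, activation="relu"),
    tf.keras.layers.Dense(15, activation="relu"),
    tf.keras.layers.Dense(10, activation="linear"),   # one logit per class
])
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),
    loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
)
```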

@@ -115,7 +115,7 @@ Learning Objectives
- Use the recommended method for implementing multiclass classification in code
- (Optional): Explain the difference between multi-label and multiclass classification

#### [Week 3 - Advice for applying machine learning](./c2-advanced-learning-algorithms/week3)
#### [Week 3 - Advice for applying machine learning](./c2-advanced-learning-algorithms/week3/)

This week you'll learn best practices for training and evaluating your learning algorithms to improve performance. This will cover a wide range of useful advice about the machine learning lifecycle, tuning your model, and also improving your training data.

@@ -144,15 +144,15 @@ Learning Objectives
- Learn how to use multiple trees, "tree ensembles" such as random forests and boosted trees (see the sketch after this list)
- Learn when to use decision trees or neural networks
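
As a hedged scikit-learn sketch (synthetic data and arbitrary hyperparameters), a random forest is one such tree ensemble:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic binary classification data, purely for illustration.
X, y = make_classification(n_samples=500, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# An ensemble of 100 shallow decision trees.
forest = RandomForestClassifier(n_estimators=100, max_depth=4, random_state=0)
forest.fit(X_train, y_train)
print(f"test accuracy: {forest.score(X_test, y_test):.2f}")
```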

### [Unsupervised Learning, Recommenders, Reinforcement Learning](./c3-unsupervised-learning)
### [Unsupervised Learning, Recommenders, Reinforcement Learning](./c3-unsupervised-learning/)

In the third course of the Machine Learning Specialization, you will:

- Use unsupervised learning techniques, including clustering and anomaly detection.
- Build recommender systems with a collaborative filtering approach and a content-based deep learning method.
- Build a deep reinforcement learning model.

#### [Week 1 - Unsupervised learning](./c3-unsupervised-learning/week1)
#### [Week 1 - Unsupervised learning](./c3-unsupervised-learning/week1/)

This week, you will learn two key unsupervised learning algorithms: clustering and anomaly detection

@@ -167,14 +167,14 @@ Learning Objectives
- Implement the function that finds the closest centroids to each point in k-means (see the sketch below)
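
A minimal numpy sketch of that step (the data and centroids are made up): each point is assigned to the centroid with the smallest squared distance.

```python
import numpy as np

def find_closest_centroids(X, centroids):
    # distances[i, j] = squared distance from point i to centroid j
    distances = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return distances.argmin(axis=1)

X = np.array([[1.0, 1.0], [1.2, 0.9], [8.0, 8.0], [7.9, 8.2]])
centroids = np.array([[1.0, 1.0], [8.0, 8.0]])
print(find_closest_centroids(X, centroids))   # [0 0 1 1]
```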


#### [Week 2 - Recommender systems](./c3-unsupervised-learning/week2)
#### [Week 2 - Recommender systems](./c3-unsupervised-learning/week2/)

Learning Objectives
- Implement collaborative filtering recommender systems in TensorFlow
- Implement deep learning content-based filtering using a neural network in TensorFlow
- Understand ethical considerations in building recommender systems

#### [Week 3 - Reinforcement learning](./c3-unsupervised-learning/week3)
#### [Week 3 - Reinforcement learning](./c3-unsupervised-learning/week3/)

This week, you will learn about reinforcement learning, and build a deep Q-learning neural network in order to land a virtual lunar lander on Mars!
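
At the heart of that exercise is the Bellman target used to train the Q-network; here is a hedged numpy sketch with made-up rewards and Q-values:

```python
import numpy as np

# Bellman target: y = r + gamma * max_a' Q(s', a'), zeroed out at terminal states.
gamma = 0.995
rewards = np.array([0.0, -1.0, 10.0])
q_next = np.array([[1.2, 0.4, -0.3],     # Q(s', a') for 3 actions per sample
                   [0.0, 0.5, 0.1],
                   [0.0, 0.0, 0.0]])
done = np.array([0, 0, 1])               # the last sample ends the episode
targets = rewards + gamma * (1 - done) * q_next.max(axis=1)
print(targets)                           # [1.194, -0.5025, 10.0]
```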

@@ -10,7 +10,7 @@ In the first course of the Machine Learning Specialization, you will:

It provides a broad introduction to modern machine learning, including supervised learning (multiple linear regression, logistic regression, neural networks, and decision trees), unsupervised learning (clustering, dimensionality reduction, recommender systems), and some of the best practices used in Silicon Valley for artificial intelligence and machine learning innovation (evaluating and tuning models, taking a data-centric approach to improving performance, and more).

### [Week 1: Introduction to Machine Learning](./week1)
### [Week 1: Introduction to Machine Learning](./week1/)

Welcome to the Machine Learning Specialization! You're joining millions of others who have taken either this or the original course, which led to the founding of Coursera, and has helped millions of other learners, like you, take a look at the exciting world of machine learning!

@@ -24,7 +24,7 @@ Learning Objectives
- Implement gradient descent
- Optimize a regression model using gradient descent

### [Week 2: Regression with multiple input variables](./week2)
### [Week 2: Regression with multiple input variables](./week2/)

This week, you'll extend linear regression to handle multiple input features. You'll also learn some methods for improving your model's training and performance, such as vectorization, feature scaling, feature engineering and polynomial regression. At the end of the week, you'll get to practice implementing linear regression in code.

@@ -34,7 +34,7 @@ Learning Objectives
- Implement linear regression in code


### [Week 3: Classification](./week3)
### [Week 3: Classification](./week3/)

This week, you'll learn the other type of supervised learning, classification. You'll learn how to predict categories using the logistic regression model. You'll learn about the problem of overfitting, and how to handle this problem with a method called regularization. You'll get to practice implementing logistic regression with regularization at the end of this week!

@@ -11,7 +11,7 @@ In the second course of the Machine Learning Specialization, you will:

It provides a broad introduction to modern machine learning, including supervised learning (multiple linear regression, logistic regression, neural networks, and decision trees), unsupervised learning (clustering, dimensionality reduction, recommender systems), and some of the best practices used in Silicon Valley for artificial intelligence and machine learning innovation (evaluating and tuning models, taking a data-centric approach to improving performance, and more).

### [Week 1 - Neural Networks](./week1)
### [Week 1 - Neural Networks](./week1/)

This week, you'll learn about neural networks and how to use them for classification tasks. You'll use the TensorFlow framework to build a neural network with just a few lines of code. Then, dive deeper by learning how to code up your own neural network in Python, "from scratch". Optionally, you can learn more about how neural network computations are implemented efficiently using parallel processing (vectorization).

@@ -26,7 +26,7 @@ Learning Objectives
- Build a neural network in regular Python code (from scratch) to make predictions.
- (Optional): Learn how neural networks use parallel processing (vectorization) to make computations faster.

### [Week 2 - Neural Networks Training](./week2)
### [Week 2 - Neural Networks Training](./week2/)

This week, you'll learn how to train your model in TensorFlow, and also learn about other important activation functions (besides the sigmoid function), and where to use each type in a neural network. You'll also learn how to go beyond binary classification to multiclass classification (3 or more categories). Multiclass classification will introduce you to a new activation function and a new loss function. Optionally, you can also learn about the difference between multiclass classification and multi-label classification. You'll learn about the Adam optimizer, and why it's an improvement upon regular gradient descent for neural network training. Finally, you will get a brief introduction to other layer types besides the one you've seen thus far.

@@ -41,7 +41,7 @@ Learning Objectives
- Use the recommended method for implementing multiclass classification in code
- (Optional): Explain the difference between multi-label and multiclass classification

### [Week 3 - Advice for applying machine learning](./week3)
### [Week 3 - Advice for applying machine learning](./week3/)

This week you'll learn best practices for training and evaluating your learning algorithms to improve performance. This will cover a wide range of useful advice about the machine learning lifecycle, tuning your model, and also improving your training data.

