
# Machine learning - Coursera

This repository summarizes my progress through Andrew Ng's machine learning course on Coursera and contains my solutions to its programming assignments.

> **PS:** The course includes tutorials on Octave, so all of the suggested solutions are implemented in Octave.

The course takes about 11 weeks to complete. It covers how to apply advanced machine learning algorithms to problems such as spam filtering, image recognition, clustering, and building recommender systems. It also covers how to select the right algorithm for the job, as well as how to 'debug' a learning algorithm and figure out how to improve its performance.

# Weekly summaries:

## Week 1:

During this week, I got to know what machine learning is and its two main learning types (supervised and unsupervised). I also saw how linear regression predicts a real-valued output from an input value. This week applies linear regression to housing price prediction, presents the notion of a cost function, and introduces the gradient descent method for learning. There is also an optional module that provides a refresher on the linear algebra concepts needed for the rest of the course.
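The gradient descent method introduced this week can be sketched as follows. The course's solutions are written in Octave; this is an illustrative Python stand-in, and the variable names (`alpha`, `theta0`, `theta1`) are my own, not the course's exact code.

```python
# Batch gradient descent for univariate linear regression:
# repeatedly apply theta := theta - alpha * dJ/dtheta until convergence.

def gradient_descent(x, y, alpha=0.1, iterations=1000):
    """Fit h(x) = theta0 + theta1 * x by batch gradient descent."""
    m = len(x)
    theta0, theta1 = 0.0, 0.0
    for _ in range(iterations):
        # Prediction errors under the current parameters.
        errors = [theta0 + theta1 * xi - yi for xi, yi in zip(x, y)]
        # Simultaneous update of both parameters.
        grad0 = sum(errors) / m
        grad1 = sum(e * xi for e, xi in zip(errors, x)) / m
        theta0 -= alpha * grad0
        theta1 -= alpha * grad1
    return theta0, theta1

# Toy data lying exactly on y = 2x + 1; the fit should recover (1, 2).
t0, t1 = gradient_descent([0, 1, 2, 3], [1, 3, 5, 7])
```

The key detail, emphasized in the lectures, is that both parameters are updated simultaneously from the same set of errors rather than one after the other.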

## Week 2:

After completing week 1 and getting to know linear regression, a natural question arises: "What if the input has more than one feature?" This week's first module answers that question, showing how linear regression can be extended to accommodate multiple input features, and discusses best practices for implementing it. A second module covers the practical side: Octave/MATLAB basics, how to manipulate data in Octave, and how to submit an assignment. Finally, the week ends with a programming assignment in Octave, discussed in detail in the next section of this file.
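Alongside gradient descent, this week also teaches the normal equation, which solves multivariate linear regression in closed form with no learning rate and no iteration. A hedged sketch, using NumPy in place of Octave:

```python
# The normal equation: theta = (X^T X)^(-1) X^T y minimizes the
# least-squares cost directly, provided X^T X is invertible.
import numpy as np

def normal_equation(X, y):
    """Return theta minimizing ||X @ theta - y||^2 (X includes a bias column)."""
    # Solving the linear system is more stable than forming the inverse.
    return np.linalg.solve(X.T @ X, X.T @ y)

# Two features plus a bias column; targets generated from theta = [1, 2, 3].
X = np.array([[1.0, 0, 0], [1, 1, 0], [1, 0, 1], [1, 1, 1], [1, 2, 1]])
y = X @ np.array([1.0, 2.0, 3.0])
theta = normal_equation(X, y)
```

As the lectures note, the normal equation is convenient for small feature counts, while gradient descent scales better when there are many features.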

## Week 3:

This week introduced the notion of classification, the cost function for logistic regression, and the application of logistic regression to multi-class classification. It also covered the overfitting problem and regularization, which helps prevent models from overfitting the training data.
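The two central formulas of this week are the sigmoid hypothesis h(x) = 1 / (1 + e^(-θᵀx)) and the logistic cost J = -(1/m) Σ [y log(h) + (1 - y) log(1 - h)]. A minimal Python sketch (the actual assignments are in Octave, and the names here are illustrative):

```python
# Sigmoid hypothesis and cross-entropy cost for logistic regression.
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def cost(theta, X, y):
    """Unregularized logistic regression cost averaged over m examples."""
    m = len(y)
    total = 0.0
    for xi, yi in zip(X, y):
        h = sigmoid(sum(t * x for t, x in zip(theta, xi)))
        total += -yi * math.log(h) - (1 - yi) * math.log(1 - h)
    return total / m

# With theta = 0 every prediction is 0.5, so the cost is log(2) ≈ 0.693,
# a useful sanity check before running an optimizer.
J0 = cost([0.0, 0.0], [[1, 1], [1, 2], [1, 3]], [0, 1, 1])
```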

# Programming assignments:

You can find the documents for all the programming assignments inside the "pdfs" folder, and all the resources and files needed for each assignment inside the "All exercises" folder of this repository. The solution to each assignment is in a separate folder at the root of the repository.

## ex1_week2:

As mentioned in the summary of week 2, the week ends with an exercise. In this exercise, I implemented linear regression and saw it work on data, applying what I learned in the last module of the week: plotting and visualizing data, fitting the linear regression parameters θ to the dataset using gradient descent, computing the cost, and debugging.
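The "computing the cost" step corresponds to a function that evaluates J(θ) = 1/(2m) Σ (h(xᵢ) - yᵢ)². The exercise implements it in Octave; this is a hedged Python equivalent with illustrative names:

```python
# Squared-error cost for univariate linear regression, used in the
# exercise to check that gradient descent is actually decreasing J.

def compute_cost(theta0, theta1, x, y):
    """J(theta) = 1/(2m) * sum of squared prediction errors."""
    m = len(x)
    return sum((theta0 + theta1 * xi - yi) ** 2
               for xi, yi in zip(x, y)) / (2 * m)

# A perfect fit of y = 2x + 1 gives zero cost; a biased fit does not.
```

Printing this value every few iterations of gradient descent is the simplest debugging tool the exercise suggests: if J ever increases, the learning rate is too large.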

## ex2_week3:

This programming assignment is about implementing logistic regression and applying it to two different datasets.
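Once the parameters are trained, applying the classifier to a dataset reduces to a thresholded prediction: output 1 when sigmoid(θᵀx) ≥ 0.5, which is equivalent to θᵀx ≥ 0. A small illustrative sketch (names and the example θ are my own, not the assignment's):

```python
# Prediction step for a trained logistic regression classifier.
# sigmoid(z) >= 0.5 exactly when z >= 0, so no sigmoid call is needed.

def predict(theta, x):
    z = sum(t * xi for t, xi in zip(theta, x))
    return 1 if z >= 0 else 0

# A hypothetical decision boundary at x1 = 2 (theta = [-2, 1], input [1, x1]):
labels = [predict([-2.0, 1.0], [1.0, x1]) for x1 in (0, 1, 2, 3)]
```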

## ex3_week4:

This programming assignment is about implementing logistic regression and applying it to two different datasets. First of all, I wrote the sigmoid function, then the cost and gradient functions for logistic regression. After that, I applied regularization to the previous example to prevent the model from overfitting the training data.
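The regularization step described above adds an L2 penalty, (λ/(2m)) Σⱼ≥₁ θⱼ², to the logistic cost, leaving the bias term θ₀ unpenalized. A hedged Python sketch of that combined cost (the assignment itself is in Octave, and `lam` is an illustrative name):

```python
# Regularized logistic regression cost: cross-entropy plus an L2 penalty
# on every parameter except the intercept theta[0].
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def regularized_cost(theta, X, y, lam):
    m = len(y)
    J = 0.0
    for xi, yi in zip(X, y):
        h = sigmoid(sum(t * x for t, x in zip(theta, xi)))
        J += -yi * math.log(h) - (1 - yi) * math.log(1 - h)
    # The intercept is excluded so regularization only shrinks feature weights.
    penalty = lam / (2 * m) * sum(t * t for t in theta[1:])
    return J / m + penalty
```

Setting λ = 0 recovers the unregularized cost, while larger λ shrinks the feature weights toward zero, trading a little training-set fit for less overfitting.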

## ex7_week8:
