
Linear Regression models

1)SIMPLE LINEAR REGRESSION:

          Simple linear regression is a statistical method for understanding the relationship between two continuous variables. It assumes a linear relationship between the independent variable (predictor) and the dependent variable (outcome). The goal is to fit a straight line that minimizes the difference between the observed values and the values predicted by the line. The line is characterized by its slope and intercept: the slope is the change in the dependent variable for a one-unit change in the independent variable, and the intercept is the value of the dependent variable when the independent variable is zero. Simple linear regression is commonly used for prediction and inference, providing insight into how changes in one variable affect another. The formula for simple linear regression is: Y = b + mX
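
A minimal sketch of fitting Y = b + mX with scikit-learn. The synthetic data and the chosen slope/intercept below are illustrative assumptions, not values from this repository:

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.uniform(0, 10, size=(50, 1))                     # single predictor
y = 3.0 * X.ravel() + 5.0 + rng.normal(0, 1, size=50)    # Y = mX + b + noise

model = LinearRegression()
model.fit(X, y)

print("slope (m):", model.coef_[0])        # close to 3.0
print("intercept (b):", model.intercept_)  # close to 5.0
print("prediction at X=4:", model.predict([[4.0]])[0])
```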

2)MULTIPLE LINEAR REGRESSION:

          Multiple Linear Regression (MLR) is a statistical technique that uses two or more explanatory (independent) variables to predict the outcome of a response (dependent) variable. The goal of MLR is to model the linear relationship between the explanatory variables and the response variable. The formula for multiple linear regression is: Y = b0 + b1X1 + b2X2 + b3X3 + ... + bnXn
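
A short sketch with three explanatory variables on synthetic data (the coefficients and noise level are assumptions chosen for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))                               # X1, X2, X3
true_coefs = np.array([1.5, -2.0, 0.5])
y = 4.0 + X @ true_coefs + rng.normal(0, 0.1, size=100)     # Y = b0 + b1X1 + b2X2 + b3X3 + noise

model = LinearRegression().fit(X, y)
print("intercept b0:", model.intercept_)   # close to 4.0
print("coefficients b1..b3:", model.coef_) # close to [1.5, -2.0, 0.5]
```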

3)POLYNOMIAL LINEAR REGRESSION:

      Polynomial regression is a form of linear regression in which polynomial terms are added to capture a non-linear relationship between the dependent and independent variables. The relationship between the dependent variable and the independent variable is modeled as an nth-degree polynomial function. When the polynomial is of degree 2 it is called a quadratic model; when the degree is 3 it is called a cubic model, and so on. The formula is: y = a0 + a1x + a2x^2 + ... + anx^n
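
A degree-2 sketch: PolynomialFeatures expands x into [1, x, x^2], and ordinary linear regression then fits the coefficients. The quadratic data-generating function below is an assumed example:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import PolynomialFeatures
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(2)
x = rng.uniform(-3, 3, size=(80, 1))
y = 1.0 + 2.0 * x.ravel() - 0.5 * x.ravel() ** 2 + rng.normal(0, 0.2, size=80)

# Pipeline: expand features to polynomial terms, then fit a linear model.
model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(x, y)
print("prediction at x=1.5:", model.predict([[1.5]])[0])
```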

4)SUPPORT VECTOR REGRESSION:

       Support Vector Regression (SVR) is a type of machine learning algorithm used for regression analysis. The goal of SVR is to find a function that approximates the relationship between the input variables and a continuous target variable, while minimizing the prediction error.

Unlike Support Vector Machines (SVMs) used for classification tasks, SVR seeks a function that best fits the data points in a continuous space. This is achieved by mapping the input variables to a high-dimensional feature space and finding a function that keeps as many data points as possible within a margin of tolerance (epsilon) around its predictions, while also minimizing the prediction error for points outside that margin.
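
A minimal SVR sketch with an RBF kernel on synthetic data; feature scaling is included because SVR is sensitive to feature magnitudes. The kernel, C, and epsilon values are illustrative assumptions:

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(3)
X = np.sort(rng.uniform(0, 5, size=(100, 1)), axis=0)
y = np.sin(X).ravel() + rng.normal(0, 0.1, size=100)

# Scale inputs, then fit an epsilon-insensitive RBF-kernel regressor.
model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
model.fit(X, y)
print("prediction at X=2.5:", model.predict([[2.5]])[0])
```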

5)DECISION TREE REGRESSION:

                     Decision tree regression observes the features of an object and trains a tree-structured model to predict future data, producing meaningful continuous output. Continuous output means the result is not discrete, i.e., it is not represented by a fixed, known set of numbers or values.
                     Discrete output example: a weather prediction model that predicts whether or not there will be rain on a particular day.
                     Continuous output example: a profit prediction model that states the probable profit that can be generated from the sale of a product.
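
A short sketch of decision tree regression: the tree splits the feature space and predicts the mean target value in each leaf, giving a continuous output. The data and max_depth setting are assumptions for illustration:

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(4)
X = rng.uniform(0, 10, size=(200, 1))
y = X.ravel() ** 2 + rng.normal(0, 5, size=200)

# Limit depth to keep the tree from memorizing the noise.
tree = DecisionTreeRegressor(max_depth=4, random_state=0)
tree.fit(X, y)
print("prediction at X=7:", tree.predict([[7.0]])[0])
```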

6)RANDOM FOREST REGRESSION:

           Random Forest Regression is a versatile machine-learning technique for predicting numerical values. A random forest is an ensemble learning method that combines the predictions of multiple decision trees to produce a more accurate and stable prediction, reducing overfitting compared to a single tree. It is a supervised learning algorithm that can be used for both classification and regression tasks, and Python's machine-learning libraries make it easy to implement and optimize.
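
A random forest regression sketch on synthetic data, evaluated with a held-out test split; the number of trees and the data-generating function are illustrative assumptions:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(5)
X = rng.normal(size=(300, 4))
y = X[:, 0] - 2 * X[:, 1] + X[:, 2] * X[:, 3] + rng.normal(0, 0.1, size=300)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Average the predictions of 200 decision trees.
forest = RandomForestRegressor(n_estimators=200, random_state=0)
forest.fit(X_train, y_train)
print("test R^2:", r2_score(y_test, forest.predict(X_test)))
```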
