This repository describes regression: linear regression, types of regression, regression metrics, and feature scaling.

Regression, in contrast to classification, involves predicting a continuous-valued output based on input features. Simple linear regression is a statistical method used to model the relationship between a single independent variable (the predictor) and a dependent variable (the response). It assumes a linear relationship between the independent variable X and the dependent variable Y. The goal is to find the best-fitting straight line (the regression line) that minimizes the sum of squared differences between the observed values of Y and the values predicted by the model.
The equation for simple linear regression is given by:
y = a + bx

where a is the intercept (the value of y when x = 0) and b is the slope (the change in y per unit change in x).
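As a concrete illustration, here is a minimal sketch of fitting a simple linear regression with scikit-learn. The data and variable names are invented purely for the example:

```python
# Minimal sketch: fitting y = a + bx with scikit-learn.
# The toy data below is made up for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

# One feature X and a continuous target y
X = np.array([[1], [2], [3], [4], [5]])   # shape (n_samples, 1)
y = np.array([2.1, 4.3, 6.2, 8.1, 9.9])

model = LinearRegression()
model.fit(X, y)

a = model.intercept_   # a in y = a + bx
b = model.coef_[0]     # b in y = a + bx
print(f"y = {a:.2f} + {b:.2f}x")

# Predict the response for a new input
print(model.predict([[6]]))
```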
Regression metrics are used to evaluate the performance of regression models, which predict continuous numerical values. These metrics quantify how well the predicted values from the model match the actual values in the dataset. Here are some commonly used regression metrics:

Mean Squared Error (MSE): MSE is the average of the squared differences between the predicted values and the actual values. It penalizes larger errors more heavily than smaller ones due to the squaring operation. Lower MSE values indicate better model performance.

Root Mean Squared Error (RMSE): RMSE is the square root of the MSE, which measures the average magnitude of the errors in the same units as the dependent variable. Like MSE, lower RMSE values indicate better model performance.
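A short sketch of computing MSE and RMSE with scikit-learn; the y_true and y_pred arrays are invented values for illustration:

```python
# Sketch: MSE and RMSE. y_true and y_pred are made-up values.
import numpy as np
from sklearn.metrics import mean_squared_error

y_true = np.array([3.0, 5.0, 7.5, 9.0])
y_pred = np.array([2.8, 5.4, 7.0, 9.6])

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)  # RMSE: square root of MSE, in the units of y

print(f"MSE:  {mse:.4f}")
print(f"RMSE: {rmse:.4f}")
```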
Mean Absolute Error (MAE): MAE is the average of the absolute differences between the predicted values and the actual values. It measures the average magnitude of the errors regardless of their direction, and it is less sensitive to outliers than MSE. Lower MAE values indicate better model performance.

R-squared (R²) Score: R-squared measures the proportion of the variance in the dependent variable that is explained by the independent variables in the model. A value of 1 indicates perfect prediction and 0 indicates that the model explains none of the variance; negative values are possible for models that fit worse than simply predicting the mean. Higher R-squared values indicate better model performance.

Adjusted R-squared (Adjusted R²): Adjusted R-squared is a modified version of R-squared that adjusts for the number of independent variables in the model. It penalizes the addition of unnecessary variables, which helps guard against overfitting, and it is particularly useful when comparing models with different numbers of predictors.
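scikit-learn provides MAE and R² directly; adjusted R² is not built in, so the sketch below computes it from the standard formula 1 - (1 - R²)(n - 1)/(n - p - 1). The sample values and the predictor count p are assumptions for illustration:

```python
# Sketch: MAE, R², and adjusted R². Sample values are made up.
import numpy as np
from sklearn.metrics import mean_absolute_error, r2_score

y_true = np.array([3.0, 5.0, 7.5, 9.0, 11.2])
y_pred = np.array([2.8, 5.4, 7.0, 9.6, 10.9])

mae = mean_absolute_error(y_true, y_pred)
r2 = r2_score(y_true, y_pred)

n = len(y_true)  # number of samples
p = 1            # number of predictors (assumed: one feature)
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - p - 1)

print(f"MAE:         {mae:.4f}")
print(f"R²:          {r2:.4f}")
print(f"Adjusted R²: {adj_r2:.4f}")
```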