OLS vs MLPRegressor

This project compares the performance of a classical statistical model (Ordinary Least Squares regression) with a neural network approach (MLPRegressor) on a supervised regression task.

The notebook demonstrates differences in predictive capability, model flexibility, and error performance between linear and non-linear methods.


Project Overview

The analysis focuses on evaluating how well a linear model (OLS) performs relative to a Multi-Layer Perceptron (MLP) regressor when modeling relationships within a dataset.

Key components include:

  • Data preprocessing
  • Feature selection and scaling
  • OLS regression modeling
  • Neural network training (MLPRegressor)
  • Prediction comparison
  • Error metric evaluation
  • Visualization of model performance

Models Compared

Ordinary Least Squares (OLS)

OLS is a linear regression technique that estimates coefficients by minimizing the sum of squared residuals between predicted and observed values.

Characteristics:

  • Assumes linear relationships
  • Interpretable coefficients
  • Sensitive to multicollinearity
  • Limited in modeling non-linear patterns
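The "minimizing the sum of squared residuals" idea can be sketched directly with NumPy on synthetic data (the coefficients below are made up for illustration):

```python
import numpy as np

# Synthetic linear data: y = 2*x1 - 3*x2 + 5 + noise
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = 2 * X[:, 0] - 3 * X[:, 1] + 5 + rng.normal(scale=0.1, size=200)

# Add an intercept column and solve the least-squares problem:
# the fitted coefficients minimize the sum of squared residuals
# between predicted and observed values.
X_design = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(X_design, y, rcond=None)

print(beta)  # recovers approximately [5, 2, -3]
```

Because the data really is linear here, OLS recovers the generating coefficients almost exactly; the interpretability comes from reading those coefficients directly.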

MLPRegressor

MLPRegressor is a feedforward artificial neural network implemented in scikit-learn.

Characteristics:

  • Captures non-linear relationships
  • Uses hidden layers and activation functions
  • Requires hyperparameter tuning
  • Less interpretable than OLS

Methodology

1. Data Preparation

  • Load dataset
  • Handle missing values
  • Define features (X) and target (y)
  • Train/test split
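The preparation steps above can be sketched as follows; the small DataFrame is a hypothetical stand-in for the project's actual dataset:

```python
import numpy as np
import pandas as pd
from sklearn.model_selection import train_test_split

# Hypothetical dataset with one missing value
df = pd.DataFrame({
    "x1": [1.0, 2.0, np.nan, 4.0, 5.0, 6.0, 7.0, 8.0],
    "x2": [2.0, 1.0, 0.5, 3.0, 2.5, 4.0, 3.5, 5.0],
    "y":  [3.1, 4.9, 2.0, 9.8, 12.4, 16.2, 17.9, 21.1],
})

# Handle missing values by dropping incomplete rows
df = df.dropna()

# Define features (X) and target (y)
X = df[["x1", "x2"]]
y = df["y"]

# Train/test split
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=42
)
```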

2. Feature Scaling

Gradient-based training of the MLP is sensitive to feature magnitudes, so scaling (e.g., StandardScaler) is applied where appropriate. OLS does not strictly require it, but the neural network converges poorly without it.
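A minimal scaling sketch; note the scaler is fit on the training data only and then reused on the test set, to avoid leaking test statistics into training:

```python
import numpy as np
from sklearn.preprocessing import StandardScaler

X_train = np.array([[1.0, 200.0], [2.0, 300.0], [3.0, 400.0]])
X_test = np.array([[2.5, 250.0]])

# Fit on training data only, then apply the same transform to test data
scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)
X_test_scaled = scaler.transform(X_test)

# After scaling, each training column has mean ~0 and std ~1
print(X_train_scaled.mean(axis=0))
print(X_train_scaled.std(axis=0))
```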

3. OLS Modeling

  • Fit OLS regression model
  • Examine coefficients and statistical summary
  • Generate predictions

4. MLP Training

  • Define network architecture
  • Train MLPRegressor
  • Tune parameters such as:
    • Hidden layer sizes
    • Learning rate
    • Activation function
    • Maximum iterations
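A sketch of the training step, with hypothetical hyperparameter choices (the actual notebook's settings may differ). The target is deliberately non-linear, which is where the MLP has an edge over OLS:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
X = rng.uniform(-2, 2, size=(400, 1))
y = np.sin(2 * X[:, 0])  # non-linear pattern a line cannot capture

scaler = StandardScaler()
X_scaled = scaler.fit_transform(X)

# Define the network architecture and training hyperparameters
mlp = MLPRegressor(
    hidden_layer_sizes=(32, 32),  # two hidden layers of 32 units
    activation="relu",            # activation function
    learning_rate_init=0.01,      # learning rate
    max_iter=2000,                # maximum iterations
    random_state=0,
)
mlp.fit(X_scaled, y)
print(mlp.score(X_scaled, y))  # R² on the training data
```

In practice these values would be tuned, e.g. via `GridSearchCV`, rather than fixed by hand.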

5. Evaluation

Models are compared using regression metrics such as:

  • Mean Squared Error (MSE)
  • Root Mean Squared Error (RMSE)
  • R² score
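These metrics can be computed with scikit-learn; the toy predictions below are made up to show the arithmetic:

```python
import numpy as np
from sklearn.metrics import mean_squared_error, r2_score

y_test = np.array([3.0, 5.0, 7.0, 9.0])
y_pred = np.array([2.8, 5.1, 7.3, 8.9])

# Mean Squared Error: average of squared residuals
mse = mean_squared_error(y_test, y_pred)   # 0.0375

# RMSE: square root of MSE, in the units of the target
rmse = np.sqrt(mse)

# R²: fraction of target variance explained by the model
r2 = r2_score(y_test, y_pred)              # 0.9925
```

Lower MSE/RMSE and an R² closer to 1 indicate the better-fitting model.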

6. Visualization

  • Predicted vs actual values
  • Residual plots
  • Model fit comparison
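The first two plots can be sketched as below; the data is fabricated for illustration (in the notebook these would come from the fitted models):

```python
import matplotlib
matplotlib.use("Agg")  # headless backend; unnecessary inside Jupyter
import matplotlib.pyplot as plt
import numpy as np

y_test = np.array([3.0, 5.0, 7.0, 9.0, 11.0])
y_pred = np.array([2.8, 5.3, 6.9, 9.4, 10.7])

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))

# Predicted vs actual: points near the diagonal indicate a good fit
ax1.scatter(y_test, y_pred)
ax1.plot([y_test.min(), y_test.max()], [y_test.min(), y_test.max()], "r--")
ax1.set_xlabel("Actual")
ax1.set_ylabel("Predicted")
ax1.set_title("Predicted vs actual")

# Residual plot: residuals should scatter randomly around zero
residuals = y_test - y_pred
ax2.scatter(y_pred, residuals)
ax2.axhline(0, color="r", linestyle="--")
ax2.set_xlabel("Predicted")
ax2.set_ylabel("Residual")
ax2.set_title("Residuals")

fig.savefig("model_comparison.png")
```

Structure in the residual plot (e.g., curvature) is a typical sign that OLS is missing a non-linear pattern the MLP can capture.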

Tech Stack

  • Python
  • Jupyter Notebook
  • pandas
  • numpy
  • matplotlib / seaborn
  • scikit-learn
  • statsmodels

How to Run

# Clone the repository
git clone https://github.com/Jkovv/MLPRegressor.git
cd MLPRegressor

# Install dependencies
pip install pandas numpy matplotlib seaborn scikit-learn statsmodels notebook

# Launch the notebook
jupyter notebook OLS_vs_MLPregressor.ipynb
