Multiple Linear Regression From Scratch


Details

  • A very simple Multiple Linear Regression (MLR) algorithm written from scratch, without Scikit-Learn or any similar libraries.
    The main point of this project is to understand how linear and multivariable regression work in the background. Understanding the math and the concept behind it is much more important than using a library that trains the model in two lines! At least for a beginner like me :)

  • I used a dataset found on Kaggle/GitHub. It is very simple: 2 features and 1 output, as it is a house price prediction problem. The concept remains the same; however, with a different number of features you will have to edit the Equation part of the code, while the rest stays the same. I used Mean Squared Error (MSE) as the loss function. I also uploaded 3 graphs: the first feature vs. the output, the second feature vs. the output, and finally the loss function (error) vs. iterations. A sketch of the training loop is shown after this list.

  • When you change the dataset, the number of iterations, or the alpha (learning rate) value, you might get NaN as the output or an overflow error. To solve this, normalize the dataset or scale it down (for example, multiply it by 0.01; the right factor depends on the dataset). This usually fixes the problem. A normalization sketch is shown after this list.

  • For all the equations I used: why do they look the way they do, and how can you derive them? Please check the References; the key formulas are summarized right after this list.
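
For reference, this is the standard form of the MSE loss and its partial derivatives that this kind of gradient-descent update relies on (the notation here is generic; the repo and the linked article may write it slightly differently):

$$
J(\mathbf{w}, b) = \frac{1}{m}\sum_{i=1}^{m}\left(\hat{y}_i - y_i\right)^2,
\qquad \hat{y}_i = \mathbf{w}\cdot\mathbf{x}_i + b
$$

$$
\frac{\partial J}{\partial w_j} = \frac{2}{m}\sum_{i=1}^{m}\left(\hat{y}_i - y_i\right)x_{ij},
\qquad
\frac{\partial J}{\partial b} = \frac{2}{m}\sum_{i=1}^{m}\left(\hat{y}_i - y_i\right)
$$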

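Below is a minimal sketch of such a training loop in NumPy, assuming a feature matrix `X` of shape (m, n) and a target vector `y` of shape (m,). The function and variable names are illustrative, not the actual code in this repo:

```python
import numpy as np

def train_mlr(X, y, alpha=0.01, n_iters=1000):
    """Gradient-descent training of multiple linear regression with an MSE loss."""
    m, n = X.shape
    w = np.zeros(n)          # one weight per feature
    b = 0.0
    losses = []

    for _ in range(n_iters):
        y_pred = X @ w + b                   # hypothesis: w1*x1 + w2*x2 + ... + b
        error = y_pred - y
        losses.append((error ** 2).mean())   # MSE for the loss-vs-iterations graph

        dw = (2 / m) * (X.T @ error)         # partial derivatives of the MSE
        db = (2 / m) * error.sum()

        w -= alpha * dw                      # gradient-descent update step
        b -= alpha * db

    return w, b, losses
```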

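And a minimal sketch of the normalization workaround mentioned above (min-max scaling here; simply multiplying by a small constant such as 0.01 is the cruder alternative):

```python
import numpy as np

def min_max_normalize(X):
    """Scale each feature column into [0, 1] so gradient descent does not overflow."""
    X = np.asarray(X, dtype=float)
    x_min = X.min(axis=0)
    x_max = X.max(axis=0)
    return (X - x_min) / (x_max - x_min)

# Cruder alternative, as suggested above (the factor depends on the dataset):
# X_scaled = X * 0.01
```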
Figures

(The three plots described in the Details section: the first feature vs. the output, the second feature vs. the output, and the loss (MSE) vs. iterations.)

Tech Stack

  • Python: Version 3.10

  • Spyder IDE: Version 5.3.2

  • Pandas: Version 1.4.3

  • Matplotlib: Version 3.5.3

Contributing

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.

If you have a suggestion that would make this better, please fork the repo and create a pull request. You can also simply open an issue with the tag "enhancement". Do not forget to give the project a star! Thanks again!


License

Distributed under the MIT License. See LICENSE.txt for more information.

References

  • This is an important article, as it illustrates how to deal with the partial derivatives.

  • A very well-explained resource on which I depended a lot while developing this algorithm: Helpful Video.

Contacts