Om-codex/Machine_Learning

🧠 Machine Learning Repository

Welcome to my Machine Learning Learning Journey! 🚀

This repository documents my progress, from the mathematical foundations of ML algorithms to full implementations. It includes from-scratch derivations, Scikit-learn applications, and clear visualizations that build intuition.


📘 Repository Structure

🔹 Core Chapters

  • Chapter1, Chapter2, Chapter3 — Foundational concepts, theory, and introductory experiments.

📊 Regression Algorithms

  • Linear Regression
    • Implemented using Normal Equation, Scikit-learn, and custom Gradient Descent.
  • Multiple Linear Regression
    • Extension of Linear Regression with multiple features.
  • Polynomial Regression
    • Demonstrating underfitting vs. overfitting.
  • Ridge Regression, Lasso, Elastic Net
    • Regularization techniques to handle multicollinearity and prevent overfitting.
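
As a quick illustration of the first item above, here is a minimal sketch of fitting a linear model with the Normal Equation. The synthetic data and parameter values are illustrative assumptions, not taken from the repository's notebooks:

```python
import numpy as np

# Synthetic data: y = 4 + 3x + Gaussian noise (illustrative example)
rng = np.random.default_rng(42)
X = 2 * rng.random((100, 1))
y = 4 + 3 * X[:, 0] + rng.normal(0, 0.5, size=100)

# Normal Equation: theta = (X^T X)^{-1} X^T y
X_b = np.c_[np.ones((100, 1)), X]              # prepend a bias column of 1s
theta = np.linalg.inv(X_b.T @ X_b) @ X_b.T @ y

print(theta)  # estimated intercept (near 4) and slope (near 3)
```

In practice, `np.linalg.lstsq` or Scikit-learn's `LinearRegression` is preferred over explicit matrix inversion for numerical stability.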

🧮 Optimization Techniques

  • Gradient Descent
    • Batch, Stochastic, and Mini-Batch implementations.
    • Detailed analysis of convergence and learning rates.
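
The batch variant can be sketched as follows on the same kind of synthetic linear data. The learning rate and epoch count here are illustrative choices, not the repository's tuned values:

```python
import numpy as np

# Synthetic data: y = 4 + 3x + Gaussian noise (illustrative example)
rng = np.random.default_rng(0)
X = 2 * rng.random((100, 1))
y = 4 + 3 * X[:, 0] + rng.normal(0, 0.5, size=100)
X_b = np.c_[np.ones((100, 1)), X]  # bias column

eta = 0.1          # learning rate (illustrative assumption)
n_epochs = 1000
m = len(y)
theta = np.zeros(2)

for _ in range(n_epochs):
    # Gradient of the MSE cost: (2/m) * X^T (X theta - y)
    gradients = (2 / m) * X_b.T @ (X_b @ theta - y)
    theta -= eta * gradients
```

Stochastic and mini-batch variants replace the full-data gradient with one computed on a single sample or a small random batch per step, trading noisier updates for cheaper iterations.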

🎯 Classification Algorithms

  • Logistic Regression
    • Binary and multiclass classification.
  • Naive Bayes
    • Probabilistic approach for categorical and text data.
  • Support Vector Machines (SVMs)
    • Linear and RBF kernel-based models.
  • Decision Trees
    • Visualization and interpretability.
  • PCA (Principal Component Analysis)
    • Dimensionality reduction and visualization of feature spaces (not a classifier itself, but a useful preprocessing step for the models above).
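
Several of these pieces combine naturally in a Scikit-learn pipeline. A minimal sketch, with dataset and hyperparameter choices that are illustrative assumptions rather than the notebooks' exact setup, scaling the Iris features, projecting to two principal components, then fitting a multiclass Logistic Regression:

```python
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, random_state=42, stratify=y
)

# Scale -> project to 2 principal components -> multiclass logistic regression
clf = make_pipeline(
    StandardScaler(),
    PCA(n_components=2),
    LogisticRegression(max_iter=1000),
)
clf.fit(X_train, y_train)
acc = clf.score(X_test, y_test)  # held-out accuracy
```

The pipeline ensures the scaler and PCA are fit only on training data, avoiding leakage into the test split.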

🌲 Ensemble Learning

  • Bagging Ensemble
    • Aggregating models trained on bootstrap samples to reduce variance.
  • Voting Ensemble
    • Hard and soft voting classifiers.
  • Gradient Boosting
    • Step-by-step implementation on the Iris dataset (gradient-boosting-classifier-on-iris.ipynb).
  • (Upcoming) AdaBoost, XGBoost, and Random Forests.
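
A soft-voting ensemble can be sketched like this; the estimator choices below are illustrative assumptions, not the repository's exact configuration:

```python
from sklearn.datasets import load_iris
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, random_state=0, stratify=y
)

# "soft" voting averages predicted class probabilities across estimators;
# "hard" voting would take a majority vote over predicted labels instead.
voting = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=1000)),
        ("dt", DecisionTreeClassifier(max_depth=3, random_state=0)),
        ("nb", GaussianNB()),
    ],
    voting="soft",
)
voting.fit(X_train, y_train)
score = voting.score(X_test, y_test)
```

Soft voting requires every estimator to implement `predict_proba`; it often outperforms hard voting when the base models' probability estimates are well calibrated.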

💡 Key Concepts Covered

  • Data Preprocessing: Scaling, encoding, handling missing values.
  • Model Evaluation: R², Accuracy, Precision, Recall, F1-score, ROC-AUC.
  • Bias–Variance Tradeoff: Theory and practical illustrations.
  • Regularization & Optimization: Ridge, Lasso, Elastic Net, Gradient Descent.
  • Dimensionality Reduction & Feature Selection.
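
The evaluation metrics above can be demonstrated together on a built-in binary dataset. A hedged sketch, where the dataset and model are assumptions chosen for illustration:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import (
    accuracy_score, f1_score, precision_score, recall_score, roc_auc_score
)
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, random_state=1, stratify=y
)

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)
y_pred = clf.predict(X_test)
y_proba = clf.predict_proba(X_test)[:, 1]  # positive-class scores for ROC-AUC

metrics = {
    "accuracy": accuracy_score(y_test, y_pred),
    "precision": precision_score(y_test, y_pred),
    "recall": recall_score(y_test, y_pred),
    "f1": f1_score(y_test, y_pred),
    "roc_auc": roc_auc_score(y_test, y_proba),
}
```

Note that ROC-AUC is computed from predicted probabilities rather than hard labels, which is why `predict_proba` is used for that metric alone.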

🛠️ Technologies Used

  • Language: Python 🐍
  • Libraries:
    NumPy, Pandas, Matplotlib, Seaborn, Scikit-learn, mlxtend, and TensorFlow (for datasets)

📚 Learning Focus

Each folder contains:

  • Jupyter notebooks explaining the math intuition 👨‍🏫
  • Implementation from scratch and via Scikit-learn ⚙️
  • Visualizations and model evaluations 📈

If you find this repository helpful, feel free to star it!
Every star helps support continued learning and sharing 😊
