
Visualized logistic, hinge, and squared losses to compare optimization behavior. Demonstrated how curvature, smoothness, and margins influence model learning and convergence, linking each loss to real-world algorithms such as SVMs and logistic regression.


Loss Functions — Understanding Model Optimization

Problem. Different learning objectives use distinct loss functions that shape how models learn and generalize. This notebook demonstrates and visualizes key loss functions used in classification and regression.

Dataset. Boston Housing Prices.

Approach.

  • Implemented Logistic Loss, Hinge Loss, and Squared Error manually and with scikit-learn.
  • Visualized how each loss penalizes misclassification or residual errors.
  • Compared gradient behaviors and optimization landscapes.
  • Discussed the effect of smoothness and margins on convergence.
  • Linked loss selection to model families (e.g., Logistic → LogisticRegression, Hinge → SVM).
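The manual implementations can be sketched as margin-based functions. This is an illustrative formulation (not code taken from the notebook), assuming binary labels y ∈ {−1, +1} and a raw score f(x), so the margin is z = y·f(x):

```python
import numpy as np

def logistic_loss(z):
    # log(1 + exp(-z)); np.logaddexp avoids overflow for large |z|
    return np.logaddexp(0.0, -z)

def hinge_loss(z):
    # max(0, 1 - z): zero once the margin exceeds 1
    return np.maximum(0.0, 1.0 - z)

def squared_loss(z):
    # (1 - z)^2, treating the margin like a regression residual
    return (1.0 - z) ** 2

z = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])  # badly wrong → confidently right
print(logistic_loss(z))   # smooth decay, never exactly zero
print(hinge_loss(z))      # → [3. 2. 1. 0. 0.]
print(squared_loss(z))    # → [9. 4. 1. 0. 1.]
```

Note that squared loss rises again at z = 2: it penalizes a confidently *correct* prediction, which is one reason it is rarely used for classification.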

Results (qualitative).

  • Logistic loss provides smoother gradients for probabilistic outputs.
  • Hinge loss enforces a margin, making it suitable for SVMs.
  • Squared error can struggle with classification: composed with a sigmoid output it yields a non-convex objective, and on raw scores it penalizes even confidently correct predictions.
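The gradient behavior behind these observations can be sketched as follows (derivatives taken with respect to the margin z; a subgradient stands in for hinge at its kink, and the function names are illustrative):

```python
import numpy as np

def logistic_grad(z):
    # d/dz log(1 + exp(-z)) = -sigmoid(-z): smooth and bounded in (-1, 0)
    return -1.0 / (1.0 + np.exp(z))

def hinge_grad(z):
    # subgradient of max(0, 1 - z): constant -1 inside the margin, 0 outside
    return np.where(z < 1.0, -1.0, 0.0)

def squared_grad(z):
    # d/dz (1 - z)^2 = -2(1 - z): unbounded, so extreme margins dominate updates
    return -2.0 * (1.0 - z)

for z in (-3.0, 0.0, 3.0):
    print(z, logistic_grad(z), hinge_grad(z), squared_grad(z))
```

At z = 3 the hinge gradient is exactly zero (the margin is satisfied, so the point stops influencing the model), the logistic gradient is small but nonzero, and the squared-error gradient is +4, still pushing a correctly classified point back toward z = 1.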

What I Learned.

  • How loss curvature affects gradient descent stability.
  • Why logistic and hinge losses differ in robustness to outliers.
  • How visualization clarifies optimization trade-offs.
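The outlier-robustness point can be made concrete with a single badly misclassified point. Using margin-based formulations of the three losses (an illustrative calculation, not from the notebook):

```python
import numpy as np

z = -10.0  # one badly misclassified point: large negative margin

logistic = float(np.logaddexp(0.0, -z))  # ~10: grows only linearly in |z|
hinge = max(0.0, 1.0 - z)                # 11: also linear
squared = (1.0 - z) ** 2                 # 121: quadratic, so one outlier
                                         # can dominate the total loss
print(logistic, hinge, squared)
```

Logistic and hinge both grow linearly for large negative margins, so a single outlier contributes proportionally; squared error grows quadratically, which is why it is far more sensitive to mislabeled or extreme points.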

Quick Start

git clone https://github.com/Joe-Naz01/loss_functions.git
cd loss_functions

python -m venv .venv
source .venv/bin/activate   # Windows: .venv\Scripts\activate
pip install -r requirements.txt
jupyter notebook
