A collection of notes and code implementations for understanding logistic regression and multiclass classification.
- Logistic_regress.ipynb: Binary logistic regression fundamentals (see the sketch below)
  - Linear equation and sigmoid function
  - Log loss (binary cross-entropy)
  - Gradient descent optimization
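
For quick reference, here is a minimal NumPy sketch of two building blocks the notebook covers, the sigmoid function and log loss. The function names, toy data, and weights are illustrative placeholders, not values taken from the notebook.

```python
import numpy as np

def sigmoid(z):
    # Map the linear score z = w.x + b to a probability in (0, 1).
    return 1.0 / (1.0 + np.exp(-z))

def log_loss(y_true, y_pred, eps=1e-12):
    # Binary cross-entropy averaged over the samples; eps keeps log() finite.
    y_pred = np.clip(y_pred, eps, 1 - eps)
    return -np.mean(y_true * np.log(y_pred) + (1 - y_true) * np.log(1 - y_pred))

# Tiny example: linear score -> probability -> loss.
X = np.array([[0.5, 1.2], [1.5, -0.3]])
y = np.array([1, 0])
w, b = np.array([0.8, -0.4]), 0.1
p = sigmoid(X @ w + b)
print(p, log_loss(y, p))
```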
- multiclass_logistic_regression.md: Multiclass classification approaches (see the sketch below)
  - One-vs-Rest (OvR)
  - Softmax function
  - Comparison between approaches
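
To make the comparison concrete, here is a small sketch of the two prediction rules, assuming NumPy and randomly initialized weights rather than anything from the notes: One-vs-Rest scores each class with an independent sigmoid and takes the argmax, while softmax normalizes a single set of scores into one probability distribution.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_classes = 4, 3
x = rng.normal(size=n_features)

# One-vs-Rest: one independent binary classifier per class; pick the largest sigmoid score.
W_ovr = rng.normal(size=(n_classes, n_features))
ovr_scores = 1.0 / (1.0 + np.exp(-(W_ovr @ x)))
ovr_pred = ovr_scores.argmax()          # the scores need not sum to 1

# Softmax: a single model whose scores are normalized into one probability distribution.
W_sm = rng.normal(size=(n_classes, n_features))
logits = W_sm @ x
exp = np.exp(logits - logits.max())     # subtract the max for numerical stability
probs = exp / exp.sum()
sm_pred = probs.argmax()                # these probabilities sum to 1

print(ovr_scores, ovr_pred)
print(probs, sm_pred)
```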
- logistic_regression.py: Simple binary classifier (see the sketch below)
  - Classifies points into two classes (0 or 1)
  - Uses gradient descent to optimize parameters
  - Demonstrates the sigmoid function and log loss
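
A minimal sketch of a gradient-descent training loop in the same spirit as logistic_regression.py; the toy points, learning rate, and iteration count are assumptions for illustration and may differ from the script.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy 2-D points: class 1 sits roughly above the line x1 + x2 = 1 (invented data).
X = np.array([[0.1, 0.2], [0.4, 0.3], [0.8, 0.9], [1.0, 0.7]])
y = np.array([0, 0, 1, 1])

w, b, lr = np.zeros(X.shape[1]), 0.0, 0.5
for _ in range(1000):
    p = sigmoid(X @ w + b)               # predicted probabilities
    w -= lr * X.T @ (p - y) / len(y)     # gradient of the mean log loss w.r.t. w
    b -= lr * np.mean(p - y)             # gradient w.r.t. b

print((sigmoid(X @ w + b) >= 0.5).astype(int))  # hard 0/1 predictions
```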
- multiclass.py: One-vs-Rest implementation (see the sketch below)
  - Example: classifying animals (cat, dog, bird, fish)
  - Feature normalization
  - Binary classifier for one class (cats)
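
A hedged sketch of the One-vs-Rest step described for multiclass.py: normalize the features, then train a single "cat vs. rest" binary classifier by relabeling all non-cat samples as 0. The feature columns (weight, height) and the animal data are invented for the example, not taken from the script.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Invented features: [weight_kg, height_cm]; labels cover cats, dogs, birds, fish.
X = np.array([[4.0, 25.0], [20.0, 60.0], [0.5, 15.0], [1.0, 10.0],
              [3.5, 23.0], [25.0, 65.0]])
labels = np.array(["cat", "dog", "bird", "fish", "cat", "dog"])

# Feature normalization: zero mean and unit variance per column.
X_norm = (X - X.mean(axis=0)) / X.std(axis=0)

# One-vs-Rest target for the "cat" classifier: cat -> 1, every other animal -> 0.
y_cat = (labels == "cat").astype(float)

w, b, lr = np.zeros(X_norm.shape[1]), 0.0, 0.1
for _ in range(2000):
    p = sigmoid(X_norm @ w + b)
    w -= lr * X_norm.T @ (p - y_cat) / len(y_cat)
    b -= lr * np.mean(p - y_cat)

print(sigmoid(X_norm @ w + b))  # probability of "cat" for each sample
```

A full One-vs-Rest classifier repeats this loop once per class and predicts the class whose binary model gives the highest score.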
- softmax.py: Softmax implementation (see the sketch below)
  - Example: classifying animals (cat, bird, fish)
  - One-hot encoding of labels
  - Categorical cross-entropy loss
  - Predicts all classes simultaneously
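
A compact sketch of the softmax pipeline described for softmax.py: one-hot encode the labels, map the linear scores to a probability distribution with softmax, and measure the fit with categorical cross-entropy. The data, dimensions, and hyperparameters are assumptions, not values from the script.

```python
import numpy as np

def softmax(logits):
    # Subtract the row-wise max for numerical stability; the probabilities are unchanged.
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def cross_entropy(y_onehot, probs, eps=1e-12):
    # Categorical cross-entropy averaged over the batch.
    return -np.mean(np.sum(y_onehot * np.log(probs + eps), axis=1))

classes = ["cat", "bird", "fish"]
X = np.array([[4.0, 25.0], [0.5, 15.0], [1.0, 10.0], [3.5, 23.0]])
y = np.array([0, 1, 2, 0])                # indices into `classes`
Y = np.eye(len(classes))[y]               # one-hot encoding of the labels

X = (X - X.mean(axis=0)) / X.std(axis=0)  # normalize features
W = np.zeros((X.shape[1], len(classes)))
b = np.zeros(len(classes))

for _ in range(2000):
    P = softmax(X @ W + b)                # probabilities for every class at once
    W -= 0.1 * X.T @ (P - Y) / len(y)     # gradient of cross-entropy w.r.t. W
    b -= 0.1 * np.mean(P - Y, axis=0)     # gradient w.r.t. b

preds = np.argmax(softmax(X @ W + b), axis=1)
print(cross_entropy(Y, softmax(X @ W + b)), [classes[i] for i in preds])
```

Because each row of the softmax output sums to 1, a single argmax per row yields the prediction for all classes simultaneously, in contrast to the per-class binary models used in One-vs-Rest.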