
Logistic Regression

Equation:

$$\hat{y} = \frac{1}{1 + e^{-z}}$$

where:

  • $z = \theta^T x = \theta_0 + \theta_1 x_1 + \theta_2 x_2 + ... + \theta_n x_n$
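
A minimal NumPy sketch of the prediction above, assuming the design matrix already carries a leading column of ones for the intercept (the names `sigmoid` and `predict_proba` are illustrative, not from the original notes):

```python
import numpy as np

def sigmoid(z):
    """Logistic (sigmoid) function: maps any real z to the interval (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

def predict_proba(theta, X):
    """Estimated probability y_hat = sigmoid(theta^T x) for each row of X.

    X is assumed to include a leading column of ones, so theta[0]
    plays the role of the intercept theta_0.
    """
    z = X @ theta          # z = theta_0 + theta_1 x_1 + ... + theta_n x_n
    return sigmoid(z)
```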

Trap

Logistic regression is often labeled a classification method, but strictly speaking it is not: it is a regression model for a probability, and it only becomes a classifier once a decision threshold is applied to that probability.

Logistic regression estimates the probability of an event occurring (for example, voted versus didn't vote) from a given set of independent variables. Since the output is a probability, the prediction is bounded between 0 and 1.
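
A short sketch of the regression-then-threshold distinction, using scikit-learn's `LogisticRegression` on hypothetical toy data (the variable names and the 0.5 cutoff are illustrative):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy data: one feature, binary outcome (1 = voted, 0 = didn't vote).
X = np.array([[1.0], [2.0], [3.0], [4.0], [5.0], [6.0]])
y = np.array([0, 0, 0, 1, 1, 1])

clf = LogisticRegression().fit(X, y)

probs = clf.predict_proba(X)[:, 1]    # the regression output: probabilities in (0, 1)
labels = (probs >= 0.5).astype(int)   # classification only happens after thresholding
```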

The decision boundary of a logistic classifier is linear, so it separates the classes well only when they are (approximately) linearly separable in the input space.
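
The linearity follows directly from the equation at the top: with the usual 0.5 cutoff, the boundary is the set of points where $z = 0$,

$$\hat{y} = 0.5 \iff z = \theta^T x = \theta_0 + \theta_1 x_1 + \dots + \theta_n x_n = 0,$$

which is a hyperplane in the input space (a straight line when there are two features).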

More