# Loss functions

Given a prediction *p* and a label *y*, a loss function measures the discrepancy between the algorithm's prediction and the desired output. VW currently supports the following loss functions, with squared loss being the default:

| Loss Function | Minimizer | Example usage |
| --- | --- | --- |
| Squared | Expectation (mean) | Regression: Expected return on stock |
| Quantile | Median | Regression: What is a typical price for a house? |
| Logistic | Probability | Classification |
| Hinge | 0-1 approximation | Classification: Is the digit a 7? |
| Classic | Expectation (mean) | Squared loss without importance-weight-aware updates |
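To make the "Minimizer" column concrete, here is a small sketch (not VW's implementation) of each per-example loss from the table, plus a brute-force grid search showing which statistic the best constant prediction recovers for each loss:

```python
import math

def squared(p, y):            # minimized by the mean of y
    return (p - y) ** 2

def quantile(p, y, tau=0.5):  # tau=0.5: minimized by the median of y
    return tau * (y - p) if y >= p else (1 - tau) * (p - y)

def logistic(p, y):           # y in {-1, +1}; smooth surrogate for 0-1 loss
    return math.log(1 + math.exp(-y * p))

def hinge(p, y):              # y in {-1, +1}; piecewise-linear 0-1 surrogate
    return max(0.0, 1 - y * p)

def argmin_constant(loss, ys):
    # Find the constant prediction that minimizes total loss on ys.
    grid = [i / 100 for i in range(-500, 1501)]
    return min(grid, key=lambda p: sum(loss(p, y) for y in ys))

ys = [1.0, 2.0, 3.0, 10.0]
print(argmin_constant(squared, ys))   # 4.0 — the mean (pulled up by the outlier 10)
print(argmin_constant(quantile, ys))  # lands between 2.0 and 3.0 — a median
```

Note how the outlier moves the squared-loss minimizer but not the quantile-loss one; this is why quantile loss answers "what is a *typical* price" rather than "what is the *expected* price".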
- If the problem is binary classification (i.e. labels are -1 and +1), your choices should be Logistic or Hinge loss (although Squared loss may work as well). If you want VW to report the 0-1 loss instead of the logistic/hinge loss, add `--binary`. Example: spam vs. non-spam, click vs. no-click.
- For binary classification where you need to know the posterior probabilities, use `--loss_function logistic --link logistic`.
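With logistic loss alone, VW's raw prediction is an unbounded margin score; `--link logistic` passes that score through the sigmoid to produce a probability in (0, 1). A minimal sketch of that mapping:

```python
import math

def sigmoid(score):
    # Maps a raw margin score in (-inf, inf) to a probability in (0, 1);
    # this is the transform applied by --link logistic.
    return 1.0 / (1.0 + math.exp(-score))

print(sigmoid(0.0))  # 0.5 — the decision boundary
print(sigmoid(2.0))  # ~0.88 — confidently positive
```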