RegERMs.jl
==========

Regularized empirical risk minimization (RegERM) is a general concept that defines a family of optimization problems in machine learning, including Support Vector Machines, Logistic Regression, and Ridge Regression.

Contents:

.. toctree::

   api
   methods

Let ${\bf x}_i$ be a vector of features describing an instance $i$ and $y_i$ be its target value. Then, for a given set of $n$ training instances $\{({\bf x}_i, y_i)\}_{i=1}^n$, the goal is to find a model ${\bf w}$ that minimizes the regularized empirical risk:

$$\sum_{i=1}^n \ell({\bf w}, {\bf x}_i, y_i) + \Omega({\bf w}).$$

The loss function $\ell$ measures the disagreement between the true target value $y_i$ and the model prediction, and the regularizer $\Omega({\bf w})$ penalizes the complexity of the model.
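To make the objective concrete, the following is a minimal, self-contained sketch that evaluates the regularized empirical risk for one common choice of its components: the logistic loss $\ell({\bf w}, {\bf x}, y) = \log(1 + \exp(-y\, {\bf w}^\top {\bf x}))$ and an L2 regularizer $\Omega({\bf w}) = \frac{\lambda}{2}\|{\bf w}\|^2$. The helper names here are illustrative and are not part of the RegERMs.jl API.

.. code-block:: julia

   using LinearAlgebra  # for dot

   # Illustrative helpers, not part of the RegERMs.jl API: a logistic
   # loss and an L2 regularizer with Ω(w) = (λ/2)‖w‖² (assumption).
   logistic_loss(w, x, y) = log(1 + exp(-y * dot(w, x)))
   l2_regularizer(w, λ) = 0.5 * λ * dot(w, w)

   function regularized_risk(w, X, y, λ)
       n = length(y)
       # empirical risk: sum of the per-instance losses ℓ(w, xᵢ, yᵢ)
       risk = sum(logistic_loss(w, X[i, :], y[i]) for i in 1:n)
       # add the complexity penalty Ω(w)
       return risk + l2_regularizer(w, λ)
   end

   # three 2-dimensional instances with targets y ∈ {-1, +1}
   X = [1.0 2.0; -1.0 0.5; 0.2 -1.5]
   y = [1.0, -1.0, -1.0]
   println(regularized_risk(zeros(2), X, y, 0.1))  # 3·log(2) ≈ 2.079 at w = 0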

.. function:: optimize(method::RegERM, λ::Float64, optimizer::Symbol=:l_bfgs)

   Perform the optimization of ``method`` for a given regularization parameter ``λ`` and return a prediction model that can be used for classification. Valid optimizers are stochastic gradient descent (``:sgd``) and limited-memory BFGS (``:l_bfgs``).
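A brief usage sketch follows. Only the ``optimize`` signature above is taken from this document; the ``SVM`` constructor and the ``predict`` call are assumptions about the surrounding API, so consult api.rst and methods.rst for the actual names.

.. code-block:: julia

   using RegERMs

   # toy training data: two Gaussian clusters, targets in {-1, +1}
   X = [randn(50, 2) .+ 1.0; randn(50, 2) .- 1.0]
   y = [ones(50); -ones(50)]

   # assumed constructor wrapping the data in a RegERM problem description
   svm = SVM(X, y)

   # optimize with regularization parameter λ = 0.1 using L-BFGS
   model = optimize(svm, 0.1, :l_bfgs)

   # assumed prediction call on the returned model
   predictions = predict(model, X)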

Indices and tables
==================

* :ref:`genindex`
* :ref:`modindex`
* :ref:`search`