Added description, and code for Huber loss. (#39)
modified:   code/loss_functions.py
modified:   docs/loss_functions.rst
mviana authored and bfortuner committed Sep 12, 2018
1 parent 373bb9e commit 31739f2
Showing 2 changed files with 13 additions and 3 deletions.
4 changes: 2 additions & 2 deletions code/loss_functions.py
@@ -27,8 +27,8 @@ def Hinge(yHat, y):
    return np.maximum(0, 1 - yHat * y)


def Huber(yHat, y):
pass
def Huber(yHat, y, delta=1.):
    return np.where(np.abs(y - yHat) < delta,
                    0.5 * (y - yHat)**2,
                    delta * (np.abs(y - yHat) - 0.5 * delta))


def KLDivergence(yHat, y):
12 changes: 11 additions & 1 deletion docs/loss_functions.rst
@@ -68,13 +68,23 @@ Used for classification.
Huber
=====

Typically used for regression. It's less sensitive to outliers than the MSE.
Typically used for regression. It's less sensitive to outliers than the MSE because it is quadratic only for small residuals (inside an interval of width :math:`\delta` around zero) and linear for large ones.

.. math::

    L_{\delta}=\left\{\begin{matrix}
        \frac{1}{2}(y - \hat{y})^{2} & \text{if } \left| y - \hat{y} \right| < \delta \\
        \delta\left(\left| y - \hat{y} \right| - \frac{1}{2}\delta\right) & \text{otherwise}
    \end{matrix}\right.

.. rubric:: Code

.. literalinclude:: ../code/loss_functions.py
:pyobject: Huber

Further information can be found in the `Huber Loss in Wikipedia`_ article.

.. _`Huber Loss in Wikipedia`: https://en.wikipedia.org/wiki/Huber_loss

.. _kl_divergence:

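The new `Huber` function above can be exercised with a short sketch. The sample data and variable names here are illustrative only (not part of the commit); the point is that an outlier contributes linearly to the Huber loss but quadratically to the squared error:

```python
import numpy as np

def Huber(yHat, y, delta=1.):
    # Quadratic for residuals smaller than delta, linear beyond it.
    return np.where(np.abs(y - yHat) < delta,
                    0.5 * (y - yHat)**2,
                    delta * (np.abs(y - yHat) - 0.5 * delta))

# Illustrative data: the last target is an outlier.
y    = np.array([1.0, 2.0, 3.0, 100.0])
yHat = np.array([1.1, 1.9, 3.2, 3.0])

huber = Huber(yHat, y)           # per-sample Huber loss
mse   = 0.5 * (y - yHat)**2      # per-sample squared error, for comparison

# For the outlier (residual 97), Huber grows linearly: 1 * (97 - 0.5) = 96.5,
# while the squared error reaches 0.5 * 97**2 = 4704.5.
```

For residuals smaller than `delta` the two losses coincide, which is what makes Huber a smooth compromise between MSE and absolute error.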
