Commit

transition margin and distance docs
Evizero committed Aug 29, 2018
1 parent 49da958 commit 9217691
Showing 9 changed files with 486 additions and 41 deletions.
11 changes: 8 additions & 3 deletions docs/make.jl
@@ -14,11 +14,16 @@ makedocs(
    pages = Any[
        "Home" => "index.md",
        "Introduction" => [
-           "Getting Started" => "introduction/gettingstarted.md",
-           "Background and Motivation" => "introduction/motivation.md",
+           "introduction/gettingstarted.md",
+           "introduction/motivation.md",
        ],
        "User's Guide" => [
-           "Working with Losses" => "user/interface.md",
+           "user/interface.md",
+           "user/aggregate.md",
+       ],
+       "Available Losses" => [
+           "losses/distance.md",
+           "losses/margin.md",
        ],
        hide("Indices" => "indices.md"),
        "LICENSE.md",
20 changes: 20 additions & 0 deletions docs/src/assets/style.css
@@ -25,3 +25,23 @@ article p {
article > header .edit-page {
margin-left: 1em;
}

/*
* Hide the ASCII plots and formulas of the docstrings
* on the margin-based and distance-based pages.
*/
.loss-docs .docstring pre:not(:nth-child(1)) {
display: none;
}
.loss-docs .docstring > div > div {
display: none;
}
.loss-docs .docstring hr {
display: none;
}
article .loss-docs table {
width: 100%;
}
article .loss-docs table td {
width: 50%;
}
48 changes: 46 additions & 2 deletions docs/src/index.md
@@ -54,12 +54,21 @@ functions and their available methods. We will start by
describing how to instantiate a loss, as well as the basic
interface that all loss functions share.

```@contents
Pages = ["user/interface.md"]
Depth = 2
```
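
As a first taste, here is a minimal sketch of that interface:

```julia
using LossFunctions

loss = L2DistLoss()      # instantiate the least squares loss
value(loss, 1.0, 0.7)    # loss of predicting 0.7 for the target 1.0: ≈ 0.09
deriv(loss, 1.0, 0.7)    # derivative with respect to the prediction: ≈ -0.6
```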

Next we will consider how to average or sum the results of the
loss functions more efficiently. The methods described here are
implemented in such a way as to avoid allocating a temporary
array.

```@contents
Pages = ["user/aggregate.md"]
Depth = 2
```
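
As a rough sketch of the difference (the `AggMode.Mean()` spelling below is
an assumption, and older releases spell it `AvgMode`; the guide listed above
documents the exported names):

```julia
using LossFunctions, Statistics

loss    = L2DistLoss()
targets = randn(100)
outputs = randn(100)

# naive: materialize every elementwise loss, then reduce the temporary array
naive = mean(value(loss, targets, outputs))

# fused: the reduction happens during evaluation, so no temporary is allocated
# (the mode name is an assumption; see the aggregation guide for the exported spelling)
fused = value(loss, targets, outputs, AggMode.Mean())

naive ≈ fused    # true
```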

## Available Loss Functions

Aside from the interface, this package also provides a number of
@@ -77,7 +86,25 @@ primarily used in regression problems. They utilize the numeric
difference between the predicted output and the true target as a
proxy variable to quantify the quality of individual predictions.

![](https://rawgithub.com/JuliaML/FileStorage/master/LossFunctions/distance.svg)

```@raw html
<table><tbody><tr><td style="text-align: left;">
```

```@contents
Pages = ["losses/distance.md"]
Depth = 2
```

```@raw html
</td><td>
```

![distance-based losses](https://rawgithub.com/JuliaML/FileStorage/master/LossFunctions/distance.svg)

```@raw html
</td></tr></tbody></table>
```

### Loss Functions for Classification

@@ -87,7 +114,24 @@ do not care about the difference between true target and
prediction. Instead they penalize predictions based on how well
they agree with the sign of the target.

![](https://rawgithub.com/JuliaML/FileStorage/master/LossFunctions/margin.svg)
```@raw html
<table><tbody><tr><td style="text-align: left;">
```

```@contents
Pages = ["losses/margin.md"]
Depth = 2
```

```@raw html
</td><td>
```

![margin-based losses](https://rawgithub.com/JuliaML/FileStorage/master/LossFunctions/margin.svg)

```@raw html
</td></tr></tbody></table>
```
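
The losses listed above are all functions of the agreement ``y \cdot \hat{y}``
between the sign-encoded target and the predicted score. As a toy sketch
(using the textbook hinge loss purely for illustration, not necessarily this
package's exact definition):

```julia
target = 1                     # binary targets are encoded as -1 or 1
output = -0.7                  # raw (real-valued) prediction of some model
agreement = target * output    # positive if the prediction has the correct sign

# textbook hinge loss: penalize small or negative agreement
hinge(a) = max(0, 1 - a)
hinge(agreement)               # 1.7 -> wrong sign, large penalty
hinge(2.0)                     # 0.0 -> confidently correct, no penalty
```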

## Common Meta Losses

141 changes: 141 additions & 0 deletions docs/src/losses/distance.md
@@ -0,0 +1,141 @@
```@meta
DocTestSetup = quote
using LossFunctions
end
```
```@raw html
<div class="loss-docs">
```

# Distance-based Losses

Loss functions that belong to the category "distance-based" are
primarily used in regression problems. They utilize the numeric
difference between the predicted output and the true target as a
proxy variable to quantify the quality of individual predictions.

This section lists all the subtypes of [`DistanceLoss`](@ref)
that are implemented in this package.
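
To make the role of that difference concrete, here is a small sketch
(evaluating the loss on the difference alone is an assumption here; the
target/output form is part of the documented interface):

```julia
using LossFunctions

loss   = L1DistLoss()
target = 3.0
output = 2.5
r = output - target            # r = ŷ - y = -0.5

value(loss, target, output)    # 0.5
value(loss, r)                 # 0.5, since a distance-based loss depends only on r
```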


## LPDistLoss

```@docs
LPDistLoss
```

Loss Function | Derivative
-------------|------------
![loss](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/LPDistLoss1.svg) | ![deriv](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/LPDistLoss2.svg)
``L(r) = \mid r \mid ^p`` | ``L'(r) = p \cdot r \cdot \mid r \mid ^{p-2}``
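
A standalone sketch of the formula above (not the package's implementation),
showing that ``p = 1`` and ``p = 2`` recover the absolute and the squared
distance:

```julia
lp(r, p) = abs(r)^p           # L(r) = |r|^p

lp(-0.5, 1) == abs(-0.5)      # true: reduces to the L1 distance
lp(-0.5, 2) == (-0.5)^2       # true: reduces to the L2 distance
lp(-0.5, 3)                   # 0.125
```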


## L1DistLoss

```@docs
L1DistLoss
```

Loss Function | Derivative
-------------|------------
![loss](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/L1DistLoss1.svg) | ![deriv](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/L1DistLoss2.svg)
``L(r) = \mid r \mid`` | ``L'(r) = \textrm{sign}(r)``
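
As a standalone sketch of the table above:

```julia
l1(r)  = abs(r)       # L(r)  = |r|
dl1(r) = sign(r)      # L'(r); sign(0) == 0 is one valid subgradient at the kink

l1(-0.5), dl1(-0.5)   # (0.5, -1.0)
```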


## L2DistLoss

```@docs
L2DistLoss
```

Loss Function | Derivative
-------------|------------
![loss](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/L2DistLoss1.svg) | ![deriv](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/L2DistLoss2.svg)
``L(r) = \mid r \mid ^2`` | ``L'(r) = 2 r``
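
And the corresponding sketch for the least squares case:

```julia
l2(r)  = abs2(r)      # L(r)  = r^2
dl2(r) = 2r           # L'(r) = 2r

l2(-0.3), dl2(-0.3)   # (≈ 0.09, ≈ -0.6)
l2(10.0)              # 100.0: large residuals are penalized far harder than under L1
```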


## LogitDistLoss

```@docs
LogitDistLoss
```

Loss Function | Derivative
-------------|------------
![loss](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/LogitDistLoss1.svg) | ![deriv](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/LogitDistLoss2.svg)
``L(r) = - \ln \frac{4 e^r}{(1 + e^r)^2}`` | ``L'(r) = \tanh \left( \frac{r}{2} \right)``
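
A standalone sketch of both formulas, with a quick finite-difference check
that the stated derivative matches:

```julia
logit_dist(r)  = -log(4 * exp(r) / (1 + exp(r))^2)    # L(r)
dlogit_dist(r) = tanh(r / 2)                          # L'(r)

r = 0.8
dlogit_dist(r)                                        # ≈ 0.3799
(logit_dist(r + 1e-6) - logit_dist(r - 1e-6)) / 2e-6  # numerically the same value
```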


## HuberLoss

```@docs
HuberLoss
```

Loss Function | Derivative
-------------|------------
![loss](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/HuberLoss1.svg) | ![deriv](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/HuberLoss2.svg)
``L(r) = \begin{cases} \frac{r^2}{2} & \quad \text{if } \mid r \mid \le \alpha \\ \alpha \mid r \mid - \frac{\alpha^2}{2} & \quad \text{otherwise}\\ \end{cases}`` | ``L'(r) = \begin{cases} r & \quad \text{if } \mid r \mid \le \alpha \\ \alpha \cdot \textrm{sign}(r) & \quad \text{otherwise}\\ \end{cases}``
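
A standalone sketch of the piecewise definition above for a fixed threshold
(the value of ``\alpha`` below is arbitrary):

```julia
# quadratic near zero, linear in the tails
huber(r, α)  = abs(r) <= α ? r^2 / 2 : α * abs(r) - α^2 / 2    # L(r)
dhuber(r, α) = abs(r) <= α ? r       : α * sign(r)             # L'(r)

α = 1.0
huber(0.5, α), dhuber(0.5, α)    # (0.125, 0.5) -- quadratic branch
huber(3.0, α), dhuber(3.0, α)    # (2.5, 1.0)   -- linear branch
```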


## L1EpsilonInsLoss

```@docs
L1EpsilonInsLoss
```

Loss Function | Derivative
-------------|------------
![loss](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/L1EpsilonInsLoss1.svg) | ![deriv](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/L1EpsilonInsLoss2.svg)
``L(r) = \max \{ 0, \mid r \mid - \epsilon \}`` | ``L'(r) = \begin{cases} \frac{r}{ \mid r \mid } & \quad \text{if } \epsilon \le \mid r \mid \\ 0 & \quad \text{otherwise}\\ \end{cases}``
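
A standalone sketch showing the ``\epsilon``-wide insensitive band around
zero (the value of ``\epsilon`` below is arbitrary):

```julia
eps_ins(r, ε)  = max(0, abs(r) - ε)             # L(r)
deps_ins(r, ε) = abs(r) < ε ? 0.0 : sign(r)     # L'(r): zero inside the band

ε = 0.5
eps_ins(0.3, ε), deps_ins(0.3, ε)    # (0.0, 0.0)   -- inside the insensitive band
eps_ins(1.2, ε), deps_ins(1.2, ε)    # (≈ 0.7, 1.0) -- behaves like L1 outside
```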


## L2EpsilonInsLoss

```@docs
L2EpsilonInsLoss
```

Loss Function | Derivative
-------------|------------
![loss](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/L2EpsilonInsLoss1.svg) | ![deriv](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/L2EpsilonInsLoss2.svg)
``L(r) = \max \{ 0, \mid r \mid - \epsilon \}^2`` | ``L'(r) = \begin{cases} 2 \cdot \textrm{sign}(r) \cdot \left( \mid r \mid - \epsilon \right) & \quad \text{if } \epsilon \le \mid r \mid \\ 0 & \quad \text{otherwise}\\ \end{cases}``
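
And the squared counterpart, again as a standalone sketch:

```julia
eps_ins2(r, ε)  = max(0, abs(r) - ε)^2                           # L(r)
deps_ins2(r, ε) = abs(r) < ε ? 0.0 : 2 * sign(r) * (abs(r) - ε)  # L'(r)

ε = 0.5
eps_ins2(0.3, ε), deps_ins2(0.3, ε)   # (0.0, 0.0)
eps_ins2(1.2, ε), deps_ins2(1.2, ε)   # (≈ 0.49, ≈ 1.4): grows quadratically outside the band
```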


## PeriodicLoss

```@docs
PeriodicLoss
```

Loss Function | Derivative
-------------|------------
![loss](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/PeriodicLoss1.svg) | ![deriv](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/PeriodicLoss2.svg)
``L(r) = 1 - \cos \left ( \frac{2 r \pi}{c} \right )`` | ``L'(r) = \frac{2 \pi}{c} \cdot \sin \left( \frac{2r \pi}{c} \right)``
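
A standalone sketch with an arbitrary period ``c``, checking that the loss
really is ``c``-periodic:

```julia
periodic(r, c)  = 1 - cos(2π * r / c)           # L(r)
dperiodic(r, c) = (2π / c) * sin(2π * r / c)    # L'(r)

c = 3.0
periodic(0.4, c) ≈ periodic(0.4 + c, c)    # true: errors that differ by c cost the same
periodic(0.0, c), periodic(c / 2, c)       # (0.0, 2.0): zero at r = 0, maximal at half a period
```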


## QuantileLoss

```@docs
QuantileLoss
```

Loss Function | Derivative
-------------|------------
![loss](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/QuantileLoss1.svg) | ![deriv](https://rawgit.com/JuliaML/FileStorage/master/LossFunctions/QuantileLoss2.svg)
``L(r) = \begin{cases} \left( 1 - \tau \right) r & \quad \text{if } r \ge 0 \\ - \tau r & \quad \text{otherwise} \\ \end{cases}`` | ``L'(r) = \begin{cases} 1 - \tau & \quad \text{if } r \ge 0 \\ - \tau & \quad \text{otherwise} \\ \end{cases}``

!!! note

    You may notice that our definition of the QuantileLoss looks
    different from the one typically found in the literature. The
    reason is that we define the residual as ``r = \hat{y} - y``,
    while the literature usually works with
    ``r_{\textrm{usual}} = y - \hat{y}``. The two definitions are
    therefore related by ``r = - r_{\textrm{usual}}``.
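
A standalone sketch of the piecewise definition in this package's sign
convention ``r = \hat{y} - y`` (the value of ``\tau`` below is arbitrary):

```julia
quantile_loss(r, τ)  = r >= 0 ? (1 - τ) * r : -τ * r    # L(r)
dquantile_loss(r, τ) = r >= 0 ? (1 - τ)     : -τ        # L'(r)

τ = 0.7
quantile_loss( 1.0, τ)    # ≈ 0.3: overshooting the target is cheap for large τ
quantile_loss(-1.0, τ)    # 0.7: undershooting costs more, pulling the fit toward the 0.7 quantile
```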


```@raw html
</div>
```
