readthedocs: add weighted loss
Evizero committed Feb 9, 2017
1 parent 8fa6423 commit 3e15869
docs/losses/scaledandweighted.rst
This is accomplished using the power of ``@fastmath``.
Reweighting a Margin Loss
----------------------------

It is not uncommon in classification scenarios to find yourself
working with imbalanced data sets, where one class has many more
observations than the other. There are different strategies to
deal with this kind of problem. The approach that this package
provides is to weight the loss for the two classes differently.
This basically means that we penalize mistakes in one class more
than mistakes in the other. More specifically, we scale the loss
of the positive class by the weight factor :math:`w` and the loss
of the negative class by :math:`1-w`.

.. code-block:: julia

   if target > 0
       w * loss(target, output)
   else
       (1 - w) * loss(target, output)
   end

Instead of providing special functions to compute a
class-weighted loss, we expose a generic way to create new
weighted versions of already existing unweighted losses. This
way, every existing subtype of :class:`MarginLoss` can be
reweighted arbitrarily. Furthermore, it allows every algorithm
that expects a binary loss to work with weighted binary losses as
well.

.. code-block:: jlcon

   julia> myloss = weightedloss(HingeLoss(), 0.8)
   LossFunctions.WeightedBinaryLoss{LossFunctions.L1HingeLoss,0.8}(LossFunctions.L1HingeLoss())

   # positive class
   julia> value(myloss, 1.0, -4.0)
   4.0

   julia> value(HingeLoss(), 1.0, -4.0)
   5.0

   # negative class
   julia> value(myloss, -1.0, 4.0)
   1.0

   julia> value(HingeLoss(), -1.0, 4.0)
   5.0

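The numbers above follow directly from the weighting rule. As a
minimal, package-independent sketch (assuming the L1 hinge loss
:math:`\max(0, 1 - t \cdot y)`, which is what ``L1HingeLoss``
computes; ``hinge`` and ``weighted_hinge`` are illustrative
helpers, not part of the package):

.. code-block:: julia

   # Sketch of a class-weighted L1 hinge loss. Not the package
   # implementation; just reproduces the arithmetic of the example.
   hinge(target, output) = max(0, 1 - target * output)

   function weighted_hinge(target, output, w)
       target > 0 ? w * hinge(target, output) : (1 - w) * hinge(target, output)
   end

   weighted_hinge(1.0, -4.0, 0.8)  # positive class: 0.8 * 5.0 = 4.0
   weighted_hinge(-1.0, 4.0, 0.8)  # negative class: (1 - 0.8) * 5.0 ≈ 1.0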
Note that the weighted version of a margin-based loss does not
itself belong to the family of margin-based losses. In other
words, the resulting loss is neither a subtype of
:class:`MarginLoss`, nor of the original type of loss.

.. code-block:: jlcon

   julia> typeof(myloss) <: MarginLoss
   false

   julia> typeof(myloss) <: HingeLoss
   false

Similar to scaled losses, the constant weight factor gets
promoted to a type parameter. This promotion can cause quite an
overhead when it is done on the fly every time the loss value is
computed. To avoid this, one can use ``Val`` to specify the
weight factor in a type-stable manner.

.. code-block:: jlcon

   julia> myloss = weightedloss(HingeLoss(), Val{0.8})
   LossFunctions.WeightedBinaryLoss{LossFunctions.L1HingeLoss,0.8}(LossFunctions.L1HingeLoss())

Storing the weight factor as a type parameter instead of a field
has a nice advantage: it makes it possible to define new types of
losses using simple type aliases.

.. code-block:: jlcon

   julia> typealias MyWeightedHingeLoss LossFunctions.WeightedBinaryLoss{HingeLoss,0.8}
   LossFunctions.WeightedBinaryLoss{LossFunctions.L1HingeLoss,0.8}

   julia> value(MyWeightedHingeLoss(), 1.0, -4.0)
   4.0
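Note that ``typealias`` was removed in Julia 0.6 in favor of
plain ``const`` bindings. On newer Julia versions the alias above
would be written as the following sketch (assuming
``LossFunctions`` is loaded):

.. code-block:: julia

   # Julia 0.6+: `typealias X T` becomes a `const` binding
   const MyWeightedHingeLoss = LossFunctions.WeightedBinaryLoss{HingeLoss,0.8}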
