
Commit

Corrected typo in predictions (#29)
bkowshik authored and bfortuner committed Jul 23, 2018
1 parent dbef02e commit d49e88f
Showing 1 changed file with 1 addition and 1 deletion.
2 changes: 1 addition & 1 deletion docs/loss_functions.rst
@@ -17,7 +17,7 @@ Cross-entropy loss, or log loss, measures the performance of a classification mo
.. image:: images/cross_entropy.png
:align: center

-The graph above shows the range of possible loss values given a true observation (isDog = 1). As the predicted probability approaches 1, log loss slowly decreases. As the predicted probability decreases, however, the log loss increases rapidly. Log loss penalizes both types of errors, but especially those predications that are confident and wrong!
+The graph above shows the range of possible loss values given a true observation (isDog = 1). As the predicted probability approaches 1, log loss slowly decreases. As the predicted probability decreases, however, the log loss increases rapidly. Log loss penalizes both types of errors, but especially those predictions that are confident and wrong!

.. note::

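The paragraph touched by this diff describes how log loss behaves for a true observation (isDog = 1). A minimal sketch of that behavior, assuming the standard binary log loss formula; the function name ``log_loss`` and the sample probabilities below are illustrative and not part of the changed file:

.. code-block:: python

    import numpy as np

    def log_loss(y_true, p_pred, eps=1e-15):
        """Binary cross-entropy (log loss) for a single prediction."""
        p = np.clip(p_pred, eps, 1 - eps)  # clip to avoid log(0)
        return -(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

    # True observation: isDog = 1
    print(log_loss(1, 0.90))  # ~0.105 -- confident and correct, small loss
    print(log_loss(1, 0.50))  # ~0.693 -- uncertain prediction
    print(log_loss(1, 0.01))  # ~4.605 -- confident and wrong, large penalty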
