
Commit bd6362f
Correct a typo (#61)
in the formula for the derivative of cost w.r.t. the output layer weight (Line 145)
lucy-kim authored and bfortuner committed Oct 12, 2019
1 parent 29be6b9 commit bd6362f
Showing 1 changed file with 1 addition and 1 deletion.
docs/backpropagation.rst (2 changes: 1 addition & 1 deletion)
@@ -142,7 +142,7 @@ Let’s return to our formula for the derivative of cost with respect to the output layer weight

     C'(W_O) = (\hat{y} - y) \cdot R'(Z_O) \cdot H

-We know we can replace the first part with our equation for output layer error :math:`E_h`. H represents the hidden layer activation.
+We know we can replace the first part with our equation for output layer error :math:`E_o`. H represents the hidden layer activation.

 .. math::
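For context, the corrected sentence substitutes the output layer error, :math:`E_o = (\hat{y} - y) \cdot R'(Z_O)`, into the gradient, giving :math:`C'(W_O) = E_o \cdot H`. A minimal NumPy sketch of that computation, assuming batch-major arrays and ReLU as the activation R; the function and argument names are illustrative, not taken from the repository:

.. code-block:: python

    import numpy as np

    def relu_prime(z):
        # Assumed activation derivative: R'(z) = 1 where z > 0, else 0.
        return (z > 0).astype(float)

    def output_weight_gradient(H, Z_o, y_hat, y):
        # Output layer error: E_o = (y_hat - y) * R'(Z_o)
        E_o = (y_hat - y) * relu_prime(Z_o)
        # C'(W_o) = E_o * H, accumulated over the batch.
        # Shapes: H is (batch, hidden), E_o is (batch, out),
        # so the gradient H.T @ E_o is (hidden, out), matching W_o.
        return H.T @ E_o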
