Commit 96b67e1

Merge pull request #24 from boazcogan/fix/issue_17
Fix/issue 17
2 parents 1eac09a + 87bf61a commit 96b67e1

File tree

1 file changed: +3 −2 lines changed

book/approaches/deep/building_blocks.md

Lines changed: 3 additions & 2 deletions
@@ -508,7 +508,8 @@ A plot of the PReLU activation function.
 Two related activation functions to ReLUs are the `Leaky ReLU` and `PReLU`.

 The Leaky ReLU is similar to the regular ReLU, but instead of
-the output being $0.0$ below $x=0.0$ it is ever so slightly above $0.0$:
+the output being $0.0$ below $x=0.0$ it is weighted ever so slightly above $0.0$
+so that small updates will still be made:

 ![leaky_relu_eq](../../images/deep_approaches/leaky_relu_eq.svg)
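The behavior described in the changed text can be sketched in NumPy. This is a minimal illustration, not the book's own code; the `negative_slope=0.01` default is the conventional Leaky ReLU coefficient, and the book's exact formula lives in the linked `leaky_relu_eq.svg` image:

```python
import numpy as np

def leaky_relu(x, negative_slope=0.01):
    """Leaky ReLU: identity for x >= 0, a small positive slope below
    zero so the output (and gradient) is not flat for negative inputs,
    meaning small updates will still be made during training."""
    return np.where(x >= 0, x, negative_slope * x)

print(leaky_relu(np.array([-2.0, 0.0, 3.0])))  # [-0.02  0.    3.  ]
```

Because the negative-side slope is nonzero, the derivative for $x < 0$ is `negative_slope` rather than $0$, which is the "small updates" point the new wording makes explicit.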
@@ -794,4 +795,4 @@ to make whole networks. We'll see how that's done on the next page.

 [^fn1]: Although we won't cover Wavenet in detail in this tutorial, it has been
-used for music source separation in this paper: {cite}`lluis2019end`.
+used for music source separation in this paper: {cite}`lluis2019end`.
