1 file changed: +3 −2 lines

@@ -508,7 +508,8 @@ A plot of the PReLU activation function.
 Two related activation functions to ReLUs are the `Leaky ReLU` and `PReLU`.
 
 The Leaky ReLU is similar to the regular ReLU, but instead of
-the output being $0.0$ below $x=0.0$ it is ever so slightly above $0.0$:
+the output being $0.0$ below $x=0.0$ it is weighted ever so slightly above $0.0$
+so that small updates will still be made:
 
 ![leaky_relu_eq](../../images/deep_approaches/leaky_relu_eq.svg)
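The added lines describe why the Leaky ReLU keeps a small slope below $x=0$: gradients still flow there, so weights feeding a "dead" unit can still receive small updates. A minimal NumPy sketch of that behavior, assuming the commonly used slope `alpha = 0.01` (the tutorial's equation image may use a different value):

```python
import numpy as np

def leaky_relu(x, alpha=0.01):
    """Leaky ReLU: passes x through unchanged when positive, and
    scales it by a small slope alpha when negative, so the output
    (and its gradient) is non-zero below x = 0."""
    return np.where(x > 0, x, alpha * x)

x = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
# Negative inputs are scaled to -0.02 and -0.005 rather than clamped to 0.
print(leaky_relu(x))
```

A PReLU works the same way, except that `alpha` becomes a learned parameter rather than a fixed constant.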
@@ -794,4 +795,4 @@ to make whole networks. We'll see how that's done on the next page.
 
 
 [^fn1]: Although we won't cover Wavenet in detail in this tutorial, it has been
-used for music source separation in this paper: {cite}`lluis2019end`.
+used for music source separation in this paper: {cite}`lluis2019end`.