2 files changed: +8 -2 lines changed

@@ -28,4 +28,7 @@ Advances in optimizing Recurrent Networks by Yoshua Bengio, Section 3.5
http://arxiv.org/pdf/1212.0901v2.pdf

Dropout: A Simple Way to Prevent Neural Networks from Overfitting
- https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf
+ https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf
+
+ The Loss Surfaces of Multilayer Networks
+ https://arxiv.org/pdf/1412.0233.pdf
@@ -24,4 +24,7 @@ Practical Deep Reinforcement Learning Approach for Stock Trading
https://arxiv.org/abs/1811.07522

Inceptionism: Going Deeper into Neural Networks
- https://ai.googleblog.com/2015/06/inceptionism-going-deeper-into-neural.html
+ https://ai.googleblog.com/2015/06/inceptionism-going-deeper-into-neural.html
+
+ The Loss Surfaces of Multilayer Networks
+ https://arxiv.org/pdf/1412.0233.pdf