From fec314c6a4f2616d34486d8ab8857124fcaf3d42 Mon Sep 17 00:00:00 2001
From: Nikunj Taneja <32366458+underscoreorcus@users.noreply.github.com>
Date: Mon, 24 Dec 2018 14:44:34 +0530
Subject: [PATCH] Fixed a typo

The explanation line under Activation Functions read 'softmax' instead of
'sigmoid'
---
 .../Part 2 - Neural Networks in PyTorch (Exercises).ipynb | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/intro-to-pytorch/Part 2 - Neural Networks in PyTorch (Exercises).ipynb b/intro-to-pytorch/Part 2 - Neural Networks in PyTorch (Exercises).ipynb
index 4b0c645ace..38ffc232e0 100644
--- a/intro-to-pytorch/Part 2 - Neural Networks in PyTorch (Exercises).ipynb
+++ b/intro-to-pytorch/Part 2 - Neural Networks in PyTorch (Exercises).ipynb
@@ -333,7 +333,7 @@
    "source": [
     "### Activation functions\n",
     "\n",
-    "So far we've only been looking at the softmax activation, but in general any function can be used as an activation function. The only requirement is that for a network to approximate a non-linear function, the activation functions must be non-linear. Here are a few more examples of common activation functions: Tanh (hyperbolic tangent), and ReLU (rectified linear unit).\n",
+    "So far we've only been looking at the sigmoid activation function, but in general any function can be used as an activation function. The only requirement is that for a network to approximate a non-linear function, the activation functions must be non-linear. Here are a few more examples of common activation functions: Tanh (hyperbolic tangent), and ReLU (rectified linear unit).\n",
     "\n",
     "\n",
     "\n",
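
For context on the notebook cell corrected above, here is a minimal sketch (not part of the patch itself) of the activation functions it names, using PyTorch's built-in `torch.sigmoid`, `torch.tanh`, and `torch.relu`. The tensor values and shapes are illustrative assumptions, not taken from the notebook.

```python
# Illustrative sketch only; tensor values are made up for demonstration.
import torch

x = torch.linspace(-5, 5, steps=11)

sigmoid_out = torch.sigmoid(x)  # squashes inputs into (0, 1)
tanh_out = torch.tanh(x)        # squashes inputs into (-1, 1)
relu_out = torch.relu(x)        # zeroes out negative inputs

print(sigmoid_out)
print(tanh_out)
print(relu_out)
```

All three are non-linear, which is the requirement the corrected explanation emphasizes for a network to approximate non-linear functions.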