diff --git a/intro-to-pytorch/Part 2 - Neural Networks in PyTorch (Exercises).ipynb b/intro-to-pytorch/Part 2 - Neural Networks in PyTorch (Exercises).ipynb
index 4b0c645ace..38ffc232e0 100644
--- a/intro-to-pytorch/Part 2 - Neural Networks in PyTorch (Exercises).ipynb
+++ b/intro-to-pytorch/Part 2 - Neural Networks in PyTorch (Exercises).ipynb
@@ -333,7 +333,7 @@
    "source": [
     "### Activation functions\n",
     "\n",
-    "So far we've only been looking at the softmax activation, but in general any function can be used as an activation function. The only requirement is that for a network to approximate a non-linear function, the activation functions must be non-linear. Here are a few more examples of common activation functions: Tanh (hyperbolic tangent), and ReLU (rectified linear unit).\n",
+    "So far we've only been looking at the sigmoid activation function, but in general any function can be used as an activation function. The only requirement is that for a network to approximate a non-linear function, the activation functions must be non-linear. Here are a few more examples of common activation functions: Tanh (hyperbolic tangent), and ReLU (rectified linear unit).\n",
     "\n",
     "\n",
     "\n",
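For context on the corrected passage, here is a minimal sketch (not part of the diff) of the three activations the notebook text names, using PyTorch's built-in functional forms; the tensor `x` and the printed comparison are illustrative assumptions, not code from the notebook:

```python
import torch

# Illustrative only: the three non-linear activations mentioned in the text.
# Each is applied elementwise; their non-linearity is what lets a stacked
# network approximate non-linear functions.
x = torch.randn(5)

sigmoid_out = torch.sigmoid(x)  # squashes values into (0, 1)
tanh_out = torch.tanh(x)        # squashes values into (-1, 1)
relu_out = torch.relu(x)        # zeroes out negatives, passes positives through

print(sigmoid_out)
print(tanh_out)
print(relu_out)
```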