1 parent aa8aefe commit 720c9c6
Python/Module3_IntroducingNumpy/AutoDiff.md
@@ -335,7 +335,7 @@ It must be noted that this approach towards finding $x_\mathrm{min}$ is highly l
 Let's take a simple example.
 We'll choose the function $f(x) = (x-8)^2$ and the starting point $x=-1.5$.
 As we search for $x_\mathrm{min}$ we don't want to make our updates to $x_o$
-too big, so we will scale our updates by a factor of $3/10$ (which is somewhat haphazardly here).
+too big, so we will scale our updates by a factor of $3/10$ (the value of which is chosen somewhat haphazardly here).

 ```python
 # Performing gradient descent on f(x) = (x - 8) ** 2
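For context, the code block that begins at the end of this hunk performs gradient descent on $f(x) = (x - 8)^2$; only its first comment line is visible here. Below is a minimal sketch of what such a loop could look like, assuming the starting point $x = -1.5$ and the $3/10$ scaling factor mentioned in the changed line; the variable names and iteration count are illustrative, not taken from AutoDiff.md.

```python
# Sketch of gradient descent on f(x) = (x - 8) ** 2, whose minimum lies at x = 8.
# The derivative is df/dx = 2 * (x - 8).
# Names and the iteration count are illustrative assumptions, not the file's actual code.

def grad(x):
    """Return df/dx for f(x) = (x - 8) ** 2."""
    return 2 * (x - 8)

x = -1.5                # starting point mentioned in the prose
learning_rate = 3 / 10  # the "somewhat haphazard" scaling factor

for _ in range(10):
    x = x - learning_rate * grad(x)  # step against the gradient

print(x)  # approaches 8, the location of the minimum
```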