Fix inconsistencies in polynomial autograd tutorial #3899
Conversation
- Use sine instead of exponential to match the tutorial text.
- Update the learning rate and iteration count to match `beginner_source/examples_tensor/polynomial_tensor.py`.
- Correct the description of the `x.grad` attribute.
- Remove redundant lines of code.
Dueling fixes (full disclosure: I'm the author of #3897)! @gtsitsik I don't know if you saw #3597, the PR that originally (and fairly recently) changed the tutorial from using sin(x) to e^x, but they gave a pretty good reason for doing so. Basically, the trained Taylor expansion given in the tutorial converges nicely to e^x, but does not converge that nicely to sin(x). You can see the problem in the following plots:
The problem is fundamentally a mathematical one: a third-order polynomial is never a good fit for sin(x), while at least over the domain [-1, 1] a third-order polynomial is a good fit for e^x. Therefore I propose that PyTorch keep the tutorial using e^x while also keeping @gtsitsik's fixes of typos and removal of unused code. I've updated #3897 with these fixes.
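The claim above is easy to check numerically with a least-squares cubic fit. The domains below are an assumption based on the discussion: [-π, π] for sin(x), as in the tutorial, and [-1, 1] for e^x.

```python
import numpy as np

# Least-squares cubic fits over the domains discussed above:
# sin(x) over [-pi, pi] (the tutorial's domain), exp(x) over [-1, 1].
x_sin = np.linspace(-np.pi, np.pi, 2000)
x_exp = np.linspace(-1.0, 1.0, 2000)

errors = {}
for name, x, y in [("sin", x_sin, np.sin(x_sin)),
                   ("exp", x_exp, np.exp(x_exp))]:
    coeffs = np.polyfit(x, y, deg=3)  # best-fit cubic
    errors[name] = np.max(np.abs(np.polyval(coeffs, x) - y))
    print(f"max |error| of cubic fit to {name}: {errors[name]:.3g}")
```

Even the best cubic misses sin(x) over [-π, π] by an order of magnitude more than it misses e^x over [-1, 1], which matches the divergence visible in the plots.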
@telamonian by the way, I noticed your PR after I created mine. Regarding using an exponential to get a better approximation, I don't think it really matters for the purposes of this tutorial. But I also don't mind if it remains an exponential. However, I do think that the 3 tutorials should be consistent with each other, as they seem to be aiming to demonstrate the same thing via different approaches. That is,
all 3 should either use a sine or an exponential; otherwise they will be inconsistent with each other and their pedagogical quality suffers.


Changes
- Use a sine to match `beginner_source/examples_tensor/polynomial_tensor.py`, which is exactly the same tutorial but with manual gradient computations.
- Correct the description of the `x.grad` attribute.

Note
PR #3663 and PR #3897 recommended updating the tutorial text to show the exponential function instead of updating the code to use a sine. However, I recommend using sine everywhere, since both the NumPy and the non-autograd PyTorch polynomial tutorials use a sine.
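For context, the fit all three tutorials perform can be sketched in plain NumPy: gradient descent on a cubic y = a + b·x + c·x² + d·x³ against sin(x) over [-π, π], with the manual backward pass that autograd automates. The learning rate (1e-6) and step count (2000) follow the values the PR description says `polynomial_tensor.py` uses; the variable names are illustrative, not the tutorial's exact code.

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 2000)
y = np.sin(x)

# Randomly initialize the four polynomial coefficients.
a, b, c, d = rng.standard_normal(4)
learning_rate = 1e-6

loss_history = []
for _ in range(2000):
    y_pred = a + b * x + c * x ** 2 + d * x ** 3
    loss = np.square(y_pred - y).sum()
    loss_history.append(loss)

    # Manual backprop through the polynomial (what autograd automates;
    # in the autograd tutorial these show up as the .grad attributes).
    grad_y_pred = 2.0 * (y_pred - y)
    grad_a = grad_y_pred.sum()
    grad_b = (grad_y_pred * x).sum()
    grad_c = (grad_y_pred * x ** 2).sum()
    grad_d = (grad_y_pred * x ** 3).sum()

    # Gradient-descent update.
    a -= learning_rate * grad_a
    b -= learning_rate * grad_b
    c -= learning_rate * grad_c
    d -= learning_rate * grad_d

print(f"loss: {loss_history[0]:.1f} -> {loss_history[-1]:.1f}")
```

The loss drops steadily but plateaus well above zero, which is the convergence gap the sin-vs-exp debate in this thread is about.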
Fixes #3632
Checklist