Conversation

@8bitmp3 (Collaborator) commented May 22, 2021

A few small suggestions to improve the JAX for the Impatient doc:

  • Provide a few minor fixes (grammar, syntax, spelling, capitalization, e.g. "JVP", etc.).
  • At the beginning of the Gradients and autodiff section:
    • Spell out which JAX "docs" are meant (the Autodiff Cookbook) for better UX.
    • Add "automatic differentiation system".

"For a full overview of JAX's automatic differentiation system, you can check the Autodiff Cookbook."

  • Define VJPs and JVPs (Vector-Jacobian/Jacobian-Vector products) at the first mention.
    • Refer to them as reverse/forward-mode autodiff for less experienced users who haven't read the Autodiff Cookbook.
    • Provide specific links to VJP and JVP sections in the Cookbook, so the users don't have to scroll down to find them.
    • Capitalize "JVP", "VJP", "Jacobian-Vector" (as per the JAX glossary: https://jax.readthedocs.io/en/latest/glossary.html?highlight=JVP#term-JVP)

"Even though, theoretically, a VJP (Vector-Jacobian product, i.e. reverse-mode autodiff) and a JVP (Jacobian-Vector product, i.e. forward-mode autodiff) are similar in that they both compute a product of a Jacobian and a vector, they differ in the computational complexity of the operation. In short, when you have a large number of parameters (hence a wide Jacobian matrix), a JVP is computationally less efficient than a VJP; conversely, a JVP is more efficient when the Jacobian is a tall matrix. You can read more in the JAX Cookbook notebook mentioned above."
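As a quick illustration (not part of the proposed doc text), here is a minimal sketch of how the two modes show up in the JAX API via jax.jvp and jax.vjp; the toy function f below is just an assumed example:

```python
import jax
import jax.numpy as jnp

# A toy function mapping R^3 -> R^2, so its Jacobian at x is a 2x3 ("wide") matrix.
def f(x):
    return jnp.stack([jnp.sum(x ** 2), jnp.prod(x)])

x = jnp.array([1.0, 2.0, 3.0])

# Forward mode: jax.jvp pushes an input-space tangent v through f and
# returns (f(x), J @ v). Recovering the full Jacobian this way takes one
# JVP per input dimension, which is cheap when the Jacobian is tall.
v = jnp.array([1.0, 0.0, 0.0])
y, jv = jax.jvp(f, (x,), (v,))

# Reverse mode: jax.vjp returns f(x) plus a function that pulls an
# output-space cotangent u back through f, i.e. u @ J. Recovering the
# full Jacobian this way takes one VJP per output dimension, which is
# cheap when the Jacobian is wide (many parameters, few outputs).
y2, vjp_fn = jax.vjp(f, x)
u = jnp.array([1.0, 0.0])
(uJ,) = vjp_fn(u)
```

(jax.jacfwd and jax.jacrev are the corresponding full-Jacobian convenience wrappers built on these two modes.)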

LMKWYT, thanks! @avital @marcvanzee @levskaya

@review-notebook-app commented

Check out this pull request on ReviewNB to see visual diffs and provide feedback on the Jupyter Notebooks.

@google-cla google-cla bot added the cla: yes label May 22, 2021
@8bitmp3 (Collaborator, Author) commented May 22, 2021

The second warning in the pytest output refers to the deprecated module, doesn't it?

WARNING: [autosummary] failed to import 'flax.linen.dot_product_attention_weights': no module named flax.linen.dot_product_attention_weights
WARNING: [autosummary] failed to import 'flax.optim.AdaFactor': no module named flax.optim.AdaFactor

@marcvanzee marcvanzee self-requested a review May 23, 2021 06:30
@marcvanzee marcvanzee merged commit a40a3c3 into google:master May 23, 2021
@8bitmp3 8bitmp3 deleted the patch-1 branch May 23, 2021 14:32