Update the docs of torch.eig about derivative (#47598)
Summary:
Related: #33090
I just realized that I hadn't updated the docs of `torch.eig` when implementing the backward.
This PR updates the docs about the grad of `torch.eig`.

cc albanD

Pull Request resolved: #47598

Reviewed By: heitorschueroff

Differential Revision: D24829373

Pulled By: albanD

fbshipit-source-id: 89963ce66b2933e6c34e2efc93ad0f2c3dd28c68
mfkasim1 authored and facebook-github-bot committed Nov 9, 2020
1 parent 4159191 commit a49367e
Showing 1 changed file with 2 additions and 2 deletions.
4 changes: 2 additions & 2 deletions torch/_torch_docs.py
@@ -2640,7 +2640,7 @@ def merge_dicts(*dicts):
 differently than dot(a, b). If the first argument is complex, the complex conjugate of the
 first argument is used for the calculation of the dot product.
-.. note::
+.. note::
     Unlike NumPy's vdot, torch.vdot intentionally only supports computing the dot product
     of two 1D tensors with the same number of elements.
@@ -2672,7 +2672,7 @@ def merge_dicts(*dicts):
 .. note::
     Since eigenvalues and eigenvectors might be complex, backward pass is supported only
-    for :func:`torch.symeig`
+    if eigenvalues and eigenvectors are all real valued.

 Args:
     input (Tensor): the square matrix of shape :math:`(n \times n)` for which the eigenvalues and eigenvectors
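For context (not part of this diff), here is a minimal sketch of the behavior the updated note describes, using the `torch.eig` API of this era (it has since been deprecated in favor of `torch.linalg.eig`). The symmetric input is chosen so that all eigenvalues and eigenvectors are real, which is the condition the note places on the backward pass; it assumes the backward implemented in #33090 accepts gradients flowing through both outputs:

```python
import torch

# A symmetric real matrix has real eigenvalues and eigenvectors,
# so the backward pass of torch.eig is supported for this input.
a = torch.tensor([[2.0, 1.0],
                  [1.0, 2.0]], requires_grad=True)

# Compute eigenvalues and eigenvectors; e has shape (n, 2) with
# real parts in e[:, 0] and imaginary parts in e[:, 1].
e, v = torch.eig(a, eigenvectors=True)

# Backprop through a scalar built from the eigenvalues and eigenvectors.
# For an input with complex eigenpairs, this backward would not be supported.
(e[:, 0].sum() + v.sum()).backward()
print(a.grad)
```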
