Update the docs of torch.eig about derivative #47598

Closed
wants to merge 1 commit
4 changes: 2 additions & 2 deletions torch/_torch_docs.py
@@ -2640,7 +2640,7 @@ def merge_dicts(*dicts):
differently than dot(a, b). If the first argument is complex, the complex conjugate of the
first argument is used for the calculation of the dot product.

.. note::

Unlike NumPy's vdot, torch.vdot intentionally only supports computing the dot product
of two 1D tensors with the same number of elements.
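Not part of the diff itself; a minimal sketch of the conjugation behavior the paragraph above describes, assuming a PyTorch build where `torch.vdot` accepts complex 1D tensors:

```python
import torch

# torch.vdot conjugates its FIRST argument before taking the dot product,
# matching the convention described in the docstring above.
a = torch.tensor([1 + 2j, 3 + 4j])
b = torch.tensor([5 + 6j, 7 + 8j])

v = torch.vdot(a, b)           # conj(a) . b
manual = (a.conj() * b).sum()  # the same computation spelled out
print(v, manual)
```

Both expressions evaluate to `70 - 8j` here; passing 2D tensors to `torch.vdot` raises an error, unlike NumPy's `vdot`, which flattens its inputs first.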
@@ -2672,7 +2672,7 @@ def merge_dicts(*dicts):

 .. note::
     Since eigenvalues and eigenvectors might be complex, backward pass is supported only
-    for :func:`torch.symeig`
+    if eigenvalues and eigenvectors are all real valued.
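Not part of the diff; a hedged sketch of the real-versus-complex distinction the note is about. `torch.eig` has since been removed from PyTorch, so this uses `torch.linalg.eig` (its modern replacement, which always returns complex-dtype results):

```python
import torch

# A symmetric real matrix has an all-real spectrum; a rotation matrix does not.
sym = torch.tensor([[2., 1.], [1., 2.]])   # eigenvalues 1 and 3 (real)
rot = torch.tensor([[0., -1.], [1., 0.]])  # 90-degree rotation, eigenvalues +/- i

vals_sym = torch.linalg.eig(sym).eigenvalues  # imaginary parts are zero
vals_rot = torch.linalg.eig(rot).eigenvalues  # imaginary parts are nonzero
```

In the first case the eigenvalues and eigenvectors are all real valued, so (per the note) the backward pass of the old `torch.eig` was supported; in the second case it was not.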

Args:
input (Tensor): the square matrix of shape :math:`(n \times n)` for which the eigenvalues and eigenvectors