From a49367e9c9fcfa9547782420a24441f13fef19dc Mon Sep 17 00:00:00 2001
From: mfkasim91
Date: Mon, 9 Nov 2020 13:16:51 -0800
Subject: [PATCH] Update the docs of torch.eig about derivative (#47598)

Summary:
Related: https://github.com/pytorch/pytorch/issues/33090

I just realized that I hadn't updated the docs of `torch.eig` when implementing the backward. Here's the PR updating the docs about the grad of `torch.eig`.

cc albanD

Pull Request resolved: https://github.com/pytorch/pytorch/pull/47598

Reviewed By: heitorschueroff

Differential Revision: D24829373

Pulled By: albanD

fbshipit-source-id: 89963ce66b2933e6c34e2efc93ad0f2c3dd28c68
---
 torch/_torch_docs.py | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/torch/_torch_docs.py b/torch/_torch_docs.py
index d4b80f38f299..5908c151812d 100644
--- a/torch/_torch_docs.py
+++ b/torch/_torch_docs.py
@@ -2640,7 +2640,7 @@ def merge_dicts(*dicts):
 differently than dot(a, b). If the first argument is complex, the complex conjugate of
 the first argument is used for the calculation of the dot product.
 
-.. note:: 
+.. note::
 
     Unlike NumPy's vdot, torch.vdot intentionally only supports computing the dot product
     of two 1D tensors with the same number of elements.
@@ -2672,7 +2672,7 @@ def merge_dicts(*dicts):
 
 .. note::
     Since eigenvalues and eigenvectors might be complex, backward pass is supported only
-    for :func:`torch.symeig`
+    if eigenvalues and eigenvectors are all real valued.
 
 Args:
     input (Tensor): the square matrix of shape :math:`(n \times n)` for which the eigenvalues and eigenvectors
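The context lines in the first hunk describe `torch.vdot`'s behavior: when the first argument is complex, its complex conjugate is used in the dot product, unlike `torch.dot`. A minimal sketch of that difference (the tensor values below are illustrative, not from the patch):

```python
import torch

# Illustrative 1-D complex tensors; torch.vdot only accepts 1-D inputs
# with the same number of elements.
a = torch.tensor([1 + 2j, 3 - 1j])
b = torch.tensor([2 + 0j, 1 + 1j])

# torch.vdot conjugates the first argument:
# conj(a) . b = (1 - 2j)*(2) + (3 + 1j)*(1 + 1j) = (2 - 4j) + (2 + 4j) = 4
vd = torch.vdot(a, b)
print(vd)  # tensor(4.+0.j)

# torch.dot does not conjugate:
# a . b = (1 + 2j)*(2) + (3 - 1j)*(1 + 1j) = (2 + 4j) + (4 + 2j) = 6 + 6j
d = torch.dot(a, b)
print(d)  # tensor(6.+6.j)
```

This is why the docstring stresses that `vdot(a, b)` behaves differently from `dot(a, b)` for complex inputs.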