Florian1990 opened this issue on Oct 17, 2018 · 2 comments
Labels
- **feature**: A request for a proper, new feature.
- **todo**: Not as important as medium- or high-priority tasks, but we will work on these.
- **triaged**: This issue has been looked at by a team member, and triaged and prioritized into an appropriate module.
I realized that PyTorch partly prevents operator overloading by raising a `TypeError` rather than returning `NotImplemented`. As a result, Python never consults methods that implement operations with reflected operands, e.g. `__rmul__`:
```python
In [1]: import torch

In [2]: class Two:
   ...:     def __mul__(self, other):
   ...:         return other * 2
   ...:     def __rmul__(self, other):
   ...:         return self * other

In [3]: two = Two()

In [4]: two * 3
Out[4]: 6

In [5]: 3 * two
Out[5]: 6

In [6]: two * torch.tensor(3)
Out[6]: tensor(6)

In [7]: torch.tensor(3) * two
Traceback (most recent call last):
  File "/path/lib/python3.6/site-packages/IPython/core/interactiveshell.py", line 2961, in run_code
    exec(code_obj, self.user_global_ns, self.user_ns)
  File "<ipython-input-8-e59deefe7a13>", line 1, in <module>
    torch.tensor(3) * two
TypeError: mul() received an invalid combination of arguments - got (Two), but expected one of:
 * (Tensor other)
      didn't match because some of the arguments have invalid types: (!Two!)
 * (float other)
      didn't match because some of the arguments have invalid types: (!Two!)
```
Is this behavior intended (e.g. for performance reasons)? I would prefer that PyTorch return `NotImplemented` (which might still result in a `TypeError`) rather than raise a `TypeError` directly.
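To illustrate the distinction being requested, here is a minimal sketch in plain Python (no PyTorch; the class names `Raises`, `Defers`, and `Two` are made up for illustration). Python only falls back to the right operand's `__rmul__` when the left operand's `__mul__` returns `NotImplemented`; if `__mul__` raises `TypeError` instead, the reflected method is never tried:

```python
class Raises:
    def __mul__(self, other):
        # Simulates the current PyTorch behavior: an eager TypeError.
        raise TypeError("unsupported operand")

class Defers:
    def __mul__(self, other):
        # Signals "I don't handle this type"; Python then tries
        # type(other).__rmul__(other, self).
        return NotImplemented

class Two:
    def __rmul__(self, other):
        # Reflected handler, reached only via the NotImplemented path.
        return 2

result = Defers() * Two()   # falls back to Two.__rmul__ and succeeds

try:
    Raises() * Two()        # Two.__rmul__ is never consulted
except TypeError:
    pass
```

If `__mul__` returns `NotImplemented` and no reflected method handles the operands either, Python itself raises a `TypeError`, so the error behavior for genuinely unsupported types is preserved.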
Hi,
I saw this thread and was just wondering whether this had been implemented?
Thank you for building such an awesome machine learning framework!
ailzhang added the **feature** and **triaged** labels on Mar 30, 2020.