I'm using version 0.4.0 of tensorly-torch and am running into an issue with non-contiguous inputs to a FactorizedEmbedding layer raising a runtime error. Here's a minimum working example:
Traceback (most recent call last):
  File "tltorch_bug_mwe.py", line 6, in <module>
    embedding(data.T)
  File "/Users/jemis/opt/miniconda3/envs/tensorized3.8/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1194, in _call_impl
    return forward_call(*input, **kwargs)
  File "/Users/jemis/opt/miniconda3/envs/tensorized3.8/lib/python3.8/site-packages/tltorch/factorized_layers/factorized_embedding.py", line 100, in forward
    flatenned_input = input.view(-1)
RuntimeError: view size is not compatible with input tensor's size and stride (at least one dimension spans across two contiguous subspaces). Use .reshape(...) instead.
Replacing input.view(-1) with input.reshape(-1) does indeed resolve this.
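The distinction between view and reshape can be shown with plain PyTorch, independent of tltorch (a minimal sketch; the tensor contents here are arbitrary and not taken from the original report):

```python
import torch

# Arbitrary demo tensor; transposing it returns a non-contiguous view.
data = torch.arange(6).reshape(2, 3)
t = data.T
assert not t.is_contiguous()

# .view() requires strides compatible with the requested shape and raises here:
try:
    t.view(-1)
except RuntimeError:
    pass  # same error class as in the traceback above

# .reshape() falls back to copying when no zero-copy view is possible:
flat = t.reshape(-1)
print(flat.tolist())  # [0, 3, 1, 4, 2, 5]
```

Since FactorizedEmbedding only needs the flattened indices and not a view that aliases the caller's storage, the extra copy that reshape may perform is harmless here.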