🐛 Bug

When converting a TorchScript model for mobile, if the model contains ops the lite interpreter does not support, a vague error is returned. It would be helpful to get a clearer description of the exact opcode that failed.
To Reproduce
Steps to reproduce the behavior:
Run the following code with an optimized_scripted_module containing an unsupported mobile operator.
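The original repro code did not survive; a minimal sketch of the conversion path that triggers the error (the module `M` and its doubled output are illustrative placeholders, assuming a PyTorch install with mobile support):

```python
import torch
from torch.utils.mobile_optimizer import optimize_for_mobile

class M(torch.nn.Module):  # hypothetical stand-in for the real model
    def forward(self, x):
        # A `with` statement inside a scripted forward compiles to the
        # enter/exit instructions that the lite interpreter rejects.
        with torch.no_grad():
            y = x * 2
        return y

scripted = torch.jit.script(M()).eval()
try:
    optimized_scripted_module = optimize_for_mobile(scripted)
    optimized_scripted_module._save_for_lite_interpreter("model.ptl")
except RuntimeError as e:
    # On PyTorch 1.9/1.11 the save step failed with:
    # ENTER is not supported in mobile module.
    print(e)
```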
Expected behavior

The RuntimeError is not exactly clear. It would be helpful to either include a link to the FORALL_OPCODES list or return the specific opcode information. Maybe something like:
RuntimeError: ENTER (enter scope of a contextmanager) is not supported in mobile module.
Instead of what's currently returned:
RuntimeError: ENTER is not supported in mobile module.
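Until the message is improved, one way to locate the offending construct is to inspect the scripted graph: a `with` block shows up as paired prim::Enter / prim::Exit nodes, and prim::Enter is what the exporter reports as ENTER. A minimal sketch (the module name is illustrative):

```python
import torch

class WithNoGrad(torch.nn.Module):  # hypothetical minimal module
    def forward(self, x):
        with torch.no_grad():
            y = x + 1
        return y

scripted = torch.jit.script(WithNoGrad())
# The textual graph names the node that becomes the ENTER instruction,
# which makes it easier to find the `with` block in the source.
print(scripted.graph)
```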
Environment
PyTorch version: 1.9.1+cu111
OS: Ubuntu 18.04.6 LTS (x86_64)
Python version: 3.8.0, pip
CUDA: 11.1
Same issue with torch-1.11.0.dev20220108+cpu (nightly) on Windows.
RecursiveScriptModule(original_name=MainModel)
Traceback (most recent call last):
  File "melspec_test.py", line 90, in <module>
    optimized_model._save_for_lite_interpreter("vgg_sre.ptl")
  File "C:\ProgramData\Anaconda3\envs\py38\lib\site-packages\torch\jit\_script.py", line 707, in _save_for_lite_interpreter
    return self._c._save_for_mobile(*args, **kwargs)
RuntimeError: ENTER is not supported in mobile module.
Hi,
I don't know whether this issue is still relevant for you, but this may help others who hit the same problem.
I got this message:

Traceback (most recent call last):
  File "C:/Users/Артём/Downloads/image_captioning_translate/checking_new.py", line 67, in <module>
    optimized_scripted_module_full._save_for_lite_interpreter("full_model_29-05.ptl")
  File "C:\Program Files\Python39\lib\site-packages\torch\jit\_script.py", line 707, in _save_for_lite_interpreter
    return self._c._save_for_mobile(*args, **kwargs)
RuntimeError: ENTER is not supported in mobile module.

The solution was to remove the `with torch.no_grad():` lines and move their contents up into the enclosing block. This:

trg_tensor = torch.tensor(trg_indexes).unsqueeze(0).to(self.device)
trg_mask = self.translater.make_trg_mask(trg_tensor)
with torch.no_grad():
    output = self.translater.decoder(trg_tensor, enc_src, trg_mask, src_mask)
pred_token = output.argmax(2)[:, -1].item()
trg_indexes.append(pred_token)

became:

trg_tensor = torch.tensor(trg_indexes).unsqueeze(0).to(self.device)
trg_mask = self.translater.make_trg_mask(trg_tensor)
output = self.translater.decoder(trg_tensor, enc_src, trg_mask, src_mask)
pred_token = output.argmax(2)[:, -1].item()
trg_indexes.append(pred_token)

After this change the problem did not occur again.
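The workaround above can be stated generally: keep `torch.no_grad()` out of the scripted forward and apply it in eager Python around the call instead. A minimal sketch (the `Decoder` module is a hypothetical stand-in for the real model):

```python
import torch

class Decoder(torch.nn.Module):  # hypothetical stand-in for the real decoder
    def forward(self, x):
        # No `with` block inside forward, so no ENTER instruction is emitted.
        return x * 2

scripted = torch.jit.script(Decoder())

# no_grad applied at the call site stays in eager Python and never
# reaches TorchScript, so it cannot block mobile export.
with torch.no_grad():
    out = scripted(torch.ones(3))
```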