
Operator is not supported in mobile module. #68987

Open
nnick14 opened this issue Nov 29, 2021 · 2 comments
Labels
oncall: mobile Related to mobile support, including iOS and Android

Comments


nnick14 commented Nov 29, 2021

🐛 Bug

When converting a TorchScript model for mobile, if the model contains an op that the lite interpreter does not support, a vague RuntimeError is raised. It would be nice to have a clearer description of the exact op code in the error.

To Reproduce

Steps to reproduce the behavior:

Run the following code with an optimized_scripted_module containing a non-supported mobile operator.

optimized_scripted_module._save_for_lite_interpreter("<output_name>.ptl")

Expected behavior

The RuntimeError message is not very informative. It would be helpful to either include a link to the FORALL_OPCODES list or return the specific OPCODE information. Maybe something like:

RuntimeError: ENTER (enter scope of a contextmanager) is not supported in mobile module.

Instead of what's currently returned:

RuntimeError: ENTER is not supported in mobile module.
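The suggested message could be produced from a small opcode-to-description table. A minimal sketch in plain Python (the table entries and the `unsupported_op_error` helper are illustrative only, not PyTorch's actual FORALL_OPCODES data or API):

```python
# Illustrative sketch of the suggested error message. The descriptions
# below are examples, not the real FORALL_OPCODES list from PyTorch.
OPCODE_DESCRIPTIONS = {
    "ENTER": "enter scope of a contextmanager",
    "EXIT": "exit scope of a contextmanager",
}

def unsupported_op_error(opcode: str) -> RuntimeError:
    """Build a RuntimeError that includes the opcode's description, if known."""
    desc = OPCODE_DESCRIPTIONS.get(opcode)
    detail = f" ({desc})" if desc else ""
    return RuntimeError(f"{opcode}{detail} is not supported in mobile module.")

print(unsupported_op_error("ENTER"))
# → ENTER (enter scope of a contextmanager) is not supported in mobile module.
```

Unknown opcodes would simply fall back to the current message, so the change is backward compatible.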

Environment

  • PyTorch version: 1.9.1+cu111
  • OS: Ubuntu 18.04.6 LTS (x86_64)
  • Python version: 3.8.0, pip
  • CUDA: 11.1
@soulitzer soulitzer added the oncall: mobile Related to mobile support, including iOS and Android label Nov 30, 2021
@zabir-nabil

Same issue with PyTorch torch-1.11.0.dev20220108+cpu (nightly) on Windows.

RecursiveScriptModule(original_name=MainModel)
Traceback (most recent call last):
  File "melspec_test.py", line 90, in <module>
    optimized_model._save_for_lite_interpreter("vgg_sre.ptl")
  File "C:\ProgramData\Anaconda3\envs\py38\lib\site-packages\torch\jit\_script.py", line 707, in _save_for_lite_interpreter
    return self._c._save_for_mobile(*args, **kwargs)
RuntimeError: ENTER is not supported in mobile module.

@VasenkovArtem

Hi,
I don't know if this issue is still relevant for you, but this may help others who run into the same problem.
I got this message:

Traceback (most recent call last):
  File "C:/Users/Артём/Downloads/image_captioning_translate/checking_new.py", line 67, in <module>
    optimized_scripted_module_full._save_for_lite_interpreter("full_model_29-05.ptl")
  File "C:\Program Files\Python39\lib\site-packages\torch\jit\_script.py", line 707, in _save_for_lite_interpreter
    return self._c._save_for_mobile(*args, **kwargs)
RuntimeError: ENTER is not supported in mobile module.

The solution was to remove the with torch.no_grad(): blocks and move their code into the enclosing scope:
trg_tensor = torch.tensor(trg_indexes).unsqueeze(0).to(self.device)
trg_mask = self.translater.make_trg_mask(trg_tensor)
with torch.no_grad():
    output = self.translater.decoder(trg_tensor, enc_src, trg_mask, src_mask)
pred_token = output.argmax(2)[:,-1].item()
trg_indexes.append(pred_token)
became:
trg_tensor = torch.tensor(trg_indexes).unsqueeze(0).to(self.device)
trg_mask = self.translater.make_trg_mask(trg_tensor)
output = self.translater.decoder(trg_tensor, enc_src, trg_mask, src_mask)
pred_token = output.argmax(2)[:,-1].item()
trg_indexes.append(pred_token)
This change fixed the problem for me. Sorry for my English :)
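As a hedged aside on why the workaround above helps: the suggested error text earlier in this thread ("enter scope of a contextmanager") indicates that ENTER corresponds to entering a context manager, and `with` statements are exactly that — they desugar to `__enter__`/`__exit__` calls on the managed object. A torch-free sketch of the mechanism (the `TracingCM` class is hypothetical, standing in for `torch.no_grad()`):

```python
# `with cm:` calls cm.__enter__() on entry and cm.__exit__() on exit.
# A `with` block inside a scripted forward() is therefore the likely
# source of the ENTER instruction the lite interpreter rejects.
class TracingCM:
    """Records enter/exit events, standing in for torch.no_grad()."""
    def __init__(self):
        self.events = []

    def __enter__(self):
        self.events.append("ENTER")
        return self

    def __exit__(self, exc_type, exc, tb):
        self.events.append("EXIT")
        return False  # do not swallow exceptions

cm = TracingCM()
with cm:
    pass
print(cm.events)  # → ['ENTER', 'EXIT']
```

Moving the body of the `with torch.no_grad():` block into the enclosing scope removes the context-manager entry, so no ENTER instruction is emitted.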
