
PyTorch 1.6.0 released today - breaks coremltools converter v4.0b2 in many cases with AttributeError: module 'torch._C' has no attribute '_jit_pass_canonicalize_ops' #827

Closed
leovinus2001 opened this issue Jul 29, 2020 · 7 comments
Labels
bug (Unexpected behaviour that should be corrected) · PyTorch (traced)

Comments

@leovinus2001
Contributor

I just noticed that PyTorch 1.6.0 was released. Now, many test cases from the last 2 weeks break with the log below, and I believe a similar issue was reported earlier in issue #751 for the PyTorch nightly build.

Reproducible:

Yes, with torch == 1.6.0 and torchvision == 0.7.0.

Example test cases that fail now are #823, #825, or #736 (comment).

Log:

Traceback (most recent call last):
  File "testAssert.py", line 55, in <module>
    inputs= [ ct.TensorType(name="input1", shape=dummy_input.shape) ]
  File "/Library/Python/3.7/lib/python/site-packages/coremltools/converters/_converters_entry.py", line 299, in convert
    **kwargs
  File "/Library/Python/3.7/lib/python/site-packages/coremltools/converters/mil/converter.py", line 120, in _convert
    prog = frontend_converter(model, **kwargs)
  File "/Library/Python/3.7/lib/python/site-packages/coremltools/converters/mil/converter.py", line 62, in __call__
    return load(*args, **kwargs)
  File "/Library/Python/3.7/lib/python/site-packages/coremltools/converters/mil/frontend/torch/load.py", line 73, in load
    converter = TorchConverter(torchscript, inputs, outputs, cut_at_symbols)
  File "/Library/Python/3.7/lib/python/site-packages/coremltools/converters/mil/frontend/torch/converter.py", line 141, in __init__
    raw_graph, params_dict = self._expand_and_optimize_ir(self.torchscript)
  File "/Library/Python/3.7/lib/python/site-packages/coremltools/converters/mil/frontend/torch/converter.py", line 273, in _expand_and_optimize_ir
    _torch._C._jit_pass_canonicalize_ops(graph)
AttributeError: module 'torch._C' has no attribute '_jit_pass_canonicalize_ops'
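
A quick way to confirm which torch build is installed and whether the pass in question exists (a small diagnostic sketch, not part of the original report):

import torch
print(torch.__version__)                                # e.g. 1.6.0
print(hasattr(torch._C, "_jit_pass_canonicalize_ops"))  # False on 1.6.0, True on 1.5.x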

@leovinus2001 added the bug label on Jul 29, 2020
@leovinus2001
Contributor Author

leovinus2001 commented Jul 29, 2020

While I do not know how to fix it properly, here is what I can see, in case it is helpful for someone else.

  1. coremltools/converters/mil/frontend/torch/converter.py
     The converter bails at line 273:
     _torch._C._jit_pass_canonicalize_ops(graph)

There are two similarly named alternatives:
_torch._C._jit_pass_canonicalize(graph)
or
_torch._C._jit_pass_canonicalize_graph_fuser_ops(graph)

Neither one has lasting success, as we then end up with (sometimes, not always):

    inputs=[ ct.TensorType(name="input1", shape=dummy_input.shape) ],
  File "Library/Python/3.7/lib/python/site-packages/coremltools/converters/_converters_entry.py", line 299, in convert
    **kwargs
  File "Library/Python/3.7/lib/python/site-packages/coremltools/converters/mil/converter.py", line 120, in _convert
    prog = frontend_converter(model, **kwargs)
  File "Library/Python/3.7/lib/python/site-packages/coremltools/converters/mil/converter.py", line 62, in __call__
    return load(*args, **kwargs)
  File "Library/Python/3.7/lib/python/site-packages/coremltools/converters/mil/frontend/torch/load.py", line 73, in load
    converter = TorchConverter(torchscript, inputs, outputs, cut_at_symbols)
  File "Library/Python/3.7/lib/python/site-packages/coremltools/converters/mil/frontend/torch/converter.py", line 143, in __init__
    raw_graph, params_dict, self.inputs, cut_at_symbols
  File "Library/Python/3.7/lib/python/site-packages/coremltools/converters/mil/frontend/torch/internal_graph.py", line 241, in __init__
    new_node = InternalTorchIRNode(raw_node, parent=self)
  File "Library/Python/3.7/lib/python/site-packages/coremltools/converters/mil/frontend/torch/internal_graph.py", line 142, in __init__
    for name in node.attributeNames()
  File "Library/Python/3.7/lib/python/site-packages/coremltools/converters/mil/frontend/torch/internal_graph.py", line 142, in <listcomp>
    for name in node.attributeNames()
AttributeError: 'torch._C.Node' object has no attribute 'ival'

And that error fires while processing a constant node, which has a 'value' attribute but no 'ival'.
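
For anyone who wants to experiment locally while waiting for a proper fix, a minimal guard around the renamed pass might look like the sketch below. The pass names are the ones listed above; as noted, this only avoids the AttributeError and does not cure the later 'ival' failure.

import torch as _torch

def _canonicalize(graph):
    # Run whichever canonicalize pass this torch build exposes.
    if hasattr(_torch._C, "_jit_pass_canonicalize_ops"):
        # Spelling expected by coremltools v4.0b2 (PyTorch <= 1.5.x).
        _torch._C._jit_pass_canonicalize_ops(graph)
    else:
        # One of the two alternatives present in PyTorch 1.6.0.
        _torch._C._jit_pass_canonicalize_graph_fuser_ops(graph)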

@hovhanns

hovhanns commented Jul 30, 2020

The same issue occurred in my case, using segmentation_models_pytorch.

Steps to reproduce

import segmentation_models_pytorch as smp
import torch
import torch.nn as nn
import coremltools as ct

def Smp_Unet(out_channels=1):
    model = smp.Unet("efficientnet-b0", classes=out_channels, encoder_weights="imagenet")
    model.segmentation_head[2] = nn.ReLU(inplace=True)
    return model

model = Smp_Unet()
model.load_state_dict(model_state_dict)  # model_state_dict: checkpoint loaded elsewhere
model.encoder.set_swish(memory_efficient=False)  # swap in the traceable Swish implementation
model.eval()
dummy_input = torch.rand(1, 3, 512, 512)
traced_model = torch.jit.trace(model, dummy_input)
core_ml_model = ct.convert(traced_model, inputs=[ct.ImageType(name="image_in")])

Getting the AttributeError:


---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-13-719676927649> in <module>
      1 traced_model = torch.jit.trace(model, dummy_input)
----> 2 core_ml_model = ct.convert(traced_model, inputs=[ct.ImageType(name="image_in")])

~/.local/lib/python3.8/site-packages/coremltools/converters/_converters_entry.py in convert(model, source, inputs, outputs, classifier_config, minimum_deployment_target, **kwargs)
    290             raise ValueError("outputs must not be specified for PyTorch")
    291 
--> 292         proto_spec = _convert(
    293             model,
    294             convert_from="torch",

~/.local/lib/python3.8/site-packages/coremltools/converters/mil/converter.py in _convert(model, convert_from, convert_to, converter_registry, **kwargs)
    118     backend_converter = backend_converter_type()
    119 
--> 120     prog = frontend_converter(model, **kwargs)
    121     common_pass(prog)
    122     out = backend_converter(prog, **kwargs)

~/.local/lib/python3.8/site-packages/coremltools/converters/mil/converter.py in __call__(self, *args, **kwargs)
     60         from .frontend.torch import load
     61 
---> 62         return load(*args, **kwargs)
     63 
     64 

~/.local/lib/python3.8/site-packages/coremltools/converters/mil/frontend/torch/load.py in load(model_spec, debug, **kwargs)
     71     outputs = kwargs.get("outputs", None)
     72     cut_at_symbols = kwargs.get("cut_at_symbols", None)
---> 73     converter = TorchConverter(torchscript, inputs, outputs, cut_at_symbols)
     74 
     75     try:

~/.local/lib/python3.8/site-packages/coremltools/converters/mil/frontend/torch/converter.py in __init__(self, torchscript, inputs, outputs, cut_at_symbols)
    138         self.output_names = outputs
    139         self.context = TranscriptionContext()
--> 140         raw_graph, params_dict = self._expand_and_optimize_ir(self.torchscript)
    141         self.graph = InternalTorchIRGraph(
    142             raw_graph, params_dict, self.inputs, cut_at_symbols

~/.local/lib/python3.8/site-packages/coremltools/converters/mil/frontend/torch/converter.py in _expand_and_optimize_ir(torchscript)
    352         _torch._C._jit_pass_lint(graph)
    353         # Replaces a couple specific ops patterns (add, sub, mul, div, chunk).
--> 354         _torch._C._jit_pass_canonicalize_ops(graph)
    355         _torch._C._jit_pass_lint(graph)
    356         # From PyTorch code: This pass catches all of the small, easy to catch

AttributeError: module 'torch._C' has no attribute '_jit_pass_canonicalize_ops'

@leovinus2001
Contributor Author

As we have seen no patch from the coremltools team, may I politely enquire whether coremltools plans to be compatible with PyTorch 1.6 going forward?

@aseemw
Collaborator

aseemw commented Aug 7, 2020

Yes, we are currently working on resolving the issues with PyTorch 1.6.0, and the next coremltools beta release will support it.

@akash1301

Is there any way to avoid this error by downgrading the PyTorch version?

@leovinus2001
Contributor Author

Maybe I misunderstand the question, but PyTorch 1.5.1 works fine with coremltools v4.0b2 or the latest top-of-tree. I use pip to jump back and forth between sets of compatible pytorch/torchvision versions. Going forward, we just need to know for sure that 1.6.x will work as well, and @aseemw confirmed earlier that it will. It would be nice to know when that fix will drop, of course, hint ;) And please verify torchvision and torchaudio against PyTorch 1.6 as well, thanks.
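
For concreteness, a pinned downgrade along those lines might look like the following (the torchvision version pairing is assumed from the usual torch/torchvision compatibility table):

pip install torch==1.5.1 torchvision==0.6.1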

@leovinus2001
Contributor Author

leovinus2001 commented Aug 20, 2020

The 4.0b3 release of coremltools fixes this issue.
