Unexpected exception happened during extracting attributes for .. #371

Closed
sumsuddin opened this issue Jan 19, 2020 · 1 comment

sumsuddin commented Jan 19, 2020

I was trying to convert a Caffe model using the mo_caffe.py script. I always get errors like the ones below, but on random nodes (all of them have the "BatchNorm" op in common). The model was trained with NVIDIA DIGITS (https://github.com/NVIDIA/DIGITS).

Model Optimizer arguments:
Common parameters:
        - Path to the Input Model:      /home/deploy.caffemodel
        - Path for generated IR:        /dldt/model-optimizer/.
        - IR output name:       deploy
        - Log level:    INFO
        - Batch:        Not specified, inherited from the model
        - Input layers:         Not specified, inherited from the model
        - Output layers:        Not specified, inherited from the model
        - Input shapes:         Not specified, inherited from the model
        - Mean values:  Not specified
        - Scale values:         Not specified
        - Scale factor:         Not specified
        - Precision of IR:      FP32
        - Enable fusing:        True
        - Enable grouped convolutions fusing:   True
        - Move mean values to preprocess section:       False
        - Reverse input channels:       False
Caffe specific parameters:
        - Path to Python Caffe* parser generated from caffe.proto:      mo/front/caffe/proto
        - Enable resnet optimization:   True
        - Path to the Input prototxt:   /home/deploy.prototxt
        - Path to CustomLayersMapping.xml:      Default
        - Path to a mean file:  Not specified
        - Offsets for a mean file:      Not specified
Model Optimizer version:        unknown version
[ INFO ]  Importing extensions from: /dldt/model-optimizer/mo
[ INFO ]  New subclass: <class 'mo.ops.crop.Crop'>
[ INFO ]  Registered a new subclass with key: Crop
[ INFO ]  New subclass: <class 'mo.ops.deformable_convolution.DeformableConvolution'>
[ INFO ]  Registered a new subclass with key: DeformableConvolution
[ INFO ]  New subclass: <class 'mo.ops.concat.Concat'>
[ INFO ]  Registered a new subclass with key: Concat
[ INFO ]  New subclass: <class 'mo.ops.split.Split'>
...
Some log output; I don't think there is anything interesting here.
...
[ WARNING ]  Skipped <class 'extensions.front.override_batch.OverrideBatch'> registration because it was already registered or it was disabled.
[ WARNING ]  Skipped <class 'extensions.front.TopKNormalize.TopKNormalize'> registration because it was already registered or it was disabled.
[ WARNING ]  Skipped <class 'extensions.front.reshape_dim_normalizer.ReshapeDimNormalizer'> registration because it was already registered or it was disabled.
[ WARNING ]  Skipped <class 'extensions.front.ArgMaxSqueeze.ArgMaxSqueeze'> registration because it was already registered or it was disabled.
[ WARNING ]  Skipped <class 'extensions.front.standalone_const_eraser.StandaloneConstEraser'> registration because it was already registered or it was disabled.
[ WARNING ]  Skipped <class 'extensions.front.TransposeOrderNormalizer.TransposeOrderNormalizer'> registration because it was already registered or it was disabled.
[ WARNING ]  Skipped <class 'mo.front.common.replacement.FrontReplacementOp'> registration because it was already registered or it was disabled.
[ WARNING ]  Skipped <class 'extensions.front.restore_ports.RestorePorts'> registration because it was already registered or it was disabled.
[ WARNING ]  Skipped <class 'extensions.front.softmax.SoftmaxFromKeras'> registration because it was already registered or it was disabled.
[ WARNING ]  Skipped <class 'extensions.front.reduce_axis_normalizer.ReduceAxisNormalizer'> registration because it was already registered or it was disabled.
[ WARNING ]  Skipped <class 'extensions.front.freeze_placeholder_value.FreezePlaceholderValue'> registration because it was already registered or it was disabled.
[ WARNING ]  Skipped <class 'extensions.front.no_op_eraser.NoOpEraser'> registration because it was already registered or it was disabled.
[ WARNING ]  Node attributes: {'_in_ports': {}, 'model_pb': name: "conv2_3_sep_bn_left"
type: "BatchNorm"
bottom: "conv2_3_sep_left"
top: "conv2_3_sep_left"
param {
  lr_mult: 0.0
  decay_mult: 0.0
}
param {
  lr_mult: 0.0
  decay_mult: 0.0
}
param {
  lr_mult: 0.0
  decay_mult: 0.0
}
blobs {
  shape {
    dim: 32
  }
}
blobs {
  shape {
    dim: 32
  }
}
blobs {
  shape {
    dim: 1
  }
}
phase: TRAIN
, 'kind': 'op', '_out_ports': {}, 'pb': name: "conv2_3_sep_bn_left"
type: "BatchNorm"
bottom: "conv2_3_sep_left"
top: "conv2_3_sep_left"
param {
  lr_mult: 0.0
  decay_mult: 0.0
}
param {
  lr_mult: 0.0
  decay_mult: 0.0
}
param {
  lr_mult: 0.0
  decay_mult: 0.0
}
, 'type': 'Parameter'}
[ ERROR ]  Unexpected exception happened during extracting attributes for node relu1.
Original exception message: list index (0) out of range
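
Judging from the node attributes printed above, the third BatchNorm blob in the caffemodel declares a shape (dim: 1) but apparently carries no serialized data values, which would explain the "list index (0) out of range" message when the extractor indexes it. A minimal, hypothetical sketch of that access pattern (assuming a caffe_pb2 module generated from caffe.proto, e.g. the one Model Optimizer keeps under mo/front/caffe/proto):

```python
# Hypothetical reproduction, not taken from the Model Optimizer sources:
# a BlobProto can declare a shape while its repeated `data` field stays empty.
from caffe_pb2 import BlobProto  # generated from caffe.proto

blob = BlobProto()
blob.shape.dim.append(1)   # the blob claims one element...
print(len(blob.data))      # ...but nothing was serialized: prints 0
scale = blob.data[0]       # raises IndexError ("list index (0) out of range")
```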

And I occasionally get the following bug report instead:

[ WARNING ]  Skipped <class 'extensions.front.TransposeOrderNormalizer.TransposeOrderNormalizer'> registration because it was already registered or it was disabled.
[ WARNING ]  Skipped <class 'mo.front.common.replacement.FrontReplacementOp'> registration because it was already registered or it was disabled.
[ WARNING ]  Skipped <class 'extensions.front.restore_ports.RestorePorts'> registration because it was already registered or it was disabled.
[ WARNING ]  Skipped <class 'extensions.front.reduce_axis_normalizer.ReduceAxisNormalizer'> registration because it was already registered or it was disabled.
[ WARNING ]  Skipped <class 'extensions.front.softmax.SoftmaxFromKeras'> registration because it was already registered or it was disabled.
[ WARNING ]  Node attributes: {'pb': name: "conv2_1_dw_bn_left"
type: "BatchNorm"
bottom: "conv2_1_dw_left"
top: "conv2_1_dw_left"
param {
  lr_mult: 0.0
  decay_mult: 0.0
}
param {
  lr_mult: 0.0
  decay_mult: 0.0
}
, 'model_pb': name: "conv2_1_dw_bn_left"
type: "BatchNorm"
bottom: "conv2_1_dw_left"
top: "conv2_1_dw_left"
param {
  lr_mult: 0.0
  decay_mult: 0.0
}
param {
  lr_mult: 0.0
  decay_mult: 0.0
}
param {
  lr_mult: 0.0
  decay_mult: 0.0
}
blobs {
  shape {
    dim: 8
  }
}
blobs {
  shape {
    dim: 8
  }
}
blobs {
  shape {
    dim: 1
  }
}
phase: TRAIN
, 'type': 'Parameter', 'kind': 'op', '_out_ports': {}, '_in_ports': {}}
[ ERROR ]  -------------------------------------------------
[ ERROR ]  ----------------- INTERNAL ERROR ----------------
[ ERROR ]  Unexpected exception happened.
[ ERROR ]  Please contact Model Optimizer developers and forward the following information:
[ ERROR ]  local variable 'new_attrs' referenced before assignment
[ ERROR ]  Traceback (most recent call last):
  File "/dldt/model-optimizer/mo/front/extractor.py", line 608, in extract_node_attrs
    supported, new_attrs = extractor(Node(graph, node))
  File "/dldt/model-optimizer/mo/pipeline/caffe.py", line 91, in <lambda>
    extract_node_attrs(graph, lambda node: caffe_extractor(node, check_for_duplicates(caffe_type_extractors)))
  File "/dldt/model-optimizer/mo/front/caffe/extractor.py", line 99, in caffe_extractor
    attrs = caffe_type_extractors[name](node)
  File "/dldt/model-optimizer/mo/front/caffe/extractor.py", line 36, in <lambda>
    return lambda node: pb_extractor(node.pb, node.model_pb)
  File "/dldt/model-optimizer/mo/front/caffe/extractors/batchnorm.py", line 52, in batch_norm_ext
    scale = blobs[2].data[0]
IndexError: list index (0) out of range

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/dldt/model-optim
```izer/mo/main.py", line 302, in main
    return driver(argv)
  File "/dldt/model-optimizer/mo/main.py", line 261, in driver
    custom_layers_mapping_path=custom_layers_mapping_path)
  File "/dldt/model-optimizer/mo/pipeline/caffe.py", line 91, in driver
    extract_node_attrs(graph, lambda node: caffe_extractor(node, check_for_duplicates(caffe_type_extractors)))
  File "/dldt/model-optimizer/mo/front/extractor.py", line 614, in extract_node_attrs
    new_attrs['name'] if 'name' in new_attrs else '<UNKNOWN>',
UnboundLocalError: local variable 'new_attrs' referenced before assignment

[ ERROR ]  ---------------- END OF BUG REPORT --------------
[ ERROR ]  -------------------------------------------------
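
For reference, the failing line in mo/front/caffe/extractors/batchnorm.py reads blobs[2].data[0] unconditionally. I am not proposing this as the official fix, but a defensive guard along these lines (a sketch, assuming the blob layout shown above) avoids the crash by falling back to a moving-average factor of 1.0 when the third blob is missing or empty:

```python
# Hypothetical guard, not the actual Model Optimizer patch: tolerate BatchNorm
# layers whose third blob (moving-average scale factor) is absent or empty.
def batch_norm_scale(blobs):
    if len(blobs) > 2 and len(blobs[2].data) > 0:
        return blobs[2].data[0]
    return 1.0  # assumed fallback when the factor was not serialized
```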

Here are the model files, extracted and zipped, in both formats:

https://drive.google.com/drive/folders/1yjRrDRIpO0nzIENYbaD_IOuRXVCvbEpH?usp=sharing

Other Caffe models convert with no issues, so I think this error is specific to this particular model.
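
For anyone who wants to poke at the attached model, here is a small inspection sketch (hypothetical, assuming a caffe_pb2 module generated from caffe.proto and a model that uses the newer `layer` field rather than the legacy `layers` one) that prints the blob layout of every BatchNorm layer:

```python
# Hypothetical inspection script: print the blob shapes and data lengths of
# every BatchNorm layer in the attached deploy.caffemodel.
from caffe_pb2 import NetParameter  # generated from caffe.proto

net = NetParameter()
with open('deploy.caffemodel', 'rb') as f:
    net.ParseFromString(f.read())

for layer in net.layer:
    if layer.type == 'BatchNorm':
        shapes = [list(b.shape.dim) for b in layer.blobs]
        data_lens = [len(b.data) for b in layer.blobs]
        print(layer.name, 'shapes:', shapes, 'data lengths:', data_lens)
```

If the third blob of a layer reports a data length of 0, that would match the IndexError above.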

@sumsuddin
Author

Fixed here
