
Some confusions when importing official TensorFlow models #7198

@aaltonenzhang

Description


While testing TensorFlow models from TF Hub (https://tfhub.dev/), I ran into several issues when importing TensorFlow IRs into TVM.

  1. When I import a saved model using TFParser.parse() and from_tensorflow(), it seems that for some models the tags are not consistent with those in the actual saved model.
  2. Importing a model from a checkpoint is not supported yet; will TVM support it in the future? And what if no metadata was exported with the checkpoint? Is it mandatory to modify the Python code for each case that lacks metadata?
  3. Why are only constant values supported for the dims parameter of the Fill operator? Will dynamic values be supported later? I hit this problem when converting the efficientnet saved model to TFLite format. Also, for efficientnet, input[1] of the Fill operator has no expr, since it is the scalar (const) value to fill; TVM nevertheless calls the get_expr() API on it directly, which is incorrect and aborts the program.
  4. Function not found - __inference_signature_wrapper_4615.

The per-model results are listed below:

| model name | import result |
| --- | --- |
| efficientnet | For dims parameter of Fill operator, only constant values are supported |
| retinanet | StatefulPartitionedCall:6 is not in graph |
| albert | StatefulPartitionedCall:6 is not in graph |
| bert | StatefulPartitionedCall:6 is not in graph |
| ncf | Function not found - __inference_signature_wrapper_4615 |
