Closed
When testing TensorFlow models from TF Hub (https://tfhub.dev/), I ran into several issues while importing TensorFlow IRs.
- When importing a SavedModel with TFParser.parse() and from_tensorflow(), I found that for some models the tags do not seem to be consistent with the actual SavedModel.
- Importing a model from a checkpoint is not supported yet. Will TVM support it in the future? Also, what if no metadata is exported with the checkpoint? Is it mandatory to modify the Python code for every case that lacks metadata?
- Why are only constant values supported for the dims parameter of the Fill operator? Will non-constant dims be supported later? I hit this problem when converting the EfficientNet SavedModel to TFLite format. Also, for EfficientNet, input[1] of the Fill operator has no expr since it is the scalar (constant) value to fill, but TVM calls get_expr() on it unconditionally, which is incorrect and aborts the program.
- Function not found - __inference_signature_wrapper_4615 (hit when importing the NCF model).
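To make the Fill problem concrete, here is a minimal, self-contained sketch of the failure mode and a possible guard. This is not TVM's actual converter code; the class and method names (`GraphProto`, `convert_fill`, `add_const`) are hypothetical, chosen only to mirror the pattern of an importer that keeps an expression map and a raw-constant map side by side:

```python
# Hypothetical sketch (not TVM source): a scalar constant fill value never
# receives an expression entry, so an unconditional get_expr() call fails.

class GraphProto:
    def __init__(self):
        self._exprs = {}   # node name -> converted expression
        self._params = {}  # node name -> raw constant value

    def add_expr(self, name, expr):
        self._exprs[name] = expr

    def add_const(self, name, value):
        self._params[name] = value

    def get_expr(self, name):
        # Raises KeyError for scalar constants that were never converted
        # to expressions -- the crash described above.
        return self._exprs[name]

    def convert_fill(self, dims_name, value_name):
        dims = self._params.get(dims_name)
        if dims is None:
            raise NotImplementedError(
                "For dims parameter of Fill operator, "
                "only constant values are supported")
        # Guarded lookup: fall back to the raw constant when no expr
        # exists, instead of calling get_expr() unconditionally.
        if value_name in self._exprs:
            fill_value = self.get_expr(value_name)
        else:
            fill_value = self._params[value_name]
        return ("full", tuple(dims), fill_value)

g = GraphProto()
g.add_const("fill/dims", [2, 3])
g.add_const("fill/value", 0.0)  # scalar const: no expr was ever created
print(g.convert_fill("fill/dims", "fill/value"))  # -> ('full', (2, 3), 0.0)
```

With the fallback in place the scalar fill value is consumed directly instead of triggering a failed expression lookup; a non-constant dims input still raises the same "only constant values are supported" error the table below reports.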
The per-model import results are listed below:
| model name | import result |
|---|---|
| efficientnet | For dims parameter of Fill operator, only constant values are supported |
| retinanet | StatefulPartitionedCall:6 is not in graph |
| albert | StatefulPartitionedCall:6 is not in graph |
| bert | StatefulPartitionedCall:6 is not in graph |
| ncf | Function not found - __inference_signature_wrapper_4615 |
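For the "StatefulPartitionedCall:6 is not in graph" failures, the relevant mechanism is that TensorFlow tensor names have the form `node_name:output_index`, and the importer resolves them against its node map. The following is a hypothetical, self-contained sketch of such a lookup (assumed function and variable names, not TVM's actual code), showing how a call node that exposes fewer outputs than the signature expects produces exactly this error:

```python
# Hypothetical sketch (not TVM source) of resolving "node:index" tensor
# names against an imported graph's node map.

def resolve_output(graph_nodes, name):
    # TensorFlow tensor names look like "node_name:output_index";
    # a bare "node_name" means output 0.
    node_name, _, index = name.partition(":")
    if node_name not in graph_nodes:
        raise KeyError(f"{name} is not in graph")
    idx = int(index) if index else 0
    if idx >= graph_nodes[node_name]:
        # The situation the table reports: the call node exists but
        # exposes fewer outputs than the requested index.
        raise KeyError(f"{name} is not in graph")
    return (node_name, idx)

# node name -> number of outputs the imported node actually exposes
nodes = {"StatefulPartitionedCall": 4}
print(resolve_output(nodes, "StatefulPartitionedCall:2"))
try:
    resolve_output(nodes, "StatefulPartitionedCall:6")
except KeyError as err:
    print("error:", err)
```

Under this reading, the retinanet/albert/bert failures would mean the imported StatefulPartitionedCall node ended up with fewer outputs than the SavedModel signature references, rather than the node being missing entirely.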