
Cannot infer shape for FusedBatchNormV3 #943

@ragavendrams

Description

Hi,

I am trying to convert a frozen graph to ONNX programmatically with tf2onnx, and I run into some warnings (the same ones mentioned in #745). I am using tf2onnx 1.5.6, which already has those changes merged.

log:

Cannot infer shape for conv1_1/conv1_1/bn/FusedBatchNormV3: conv1_1/conv1_1/bn/FusedBatchNormV3:5
Cannot infer shape for conv1_2/conv1_2/bn/FusedBatchNormV3: conv1_2/conv1_2/bn/FusedBatchNormV3:5
Cannot infer shape for conv2_1/1/conv2_1/1/bn/FusedBatchNormV3: conv2_1/1/conv2_1/1/bn/FusedBatchNormV3:5
Cannot infer shape for conv2_3/bn/FusedBatchNormV3: conv2_3/bn/FusedBatchNormV3:5
Cannot infer shape for conv2_3/1/conv2_3/1/bn/FusedBatchNormV3: conv2_3/1/conv2_3/1/bn/FusedBatchNormV3:5
Cannot infer shape for conv3_1/bn/FusedBatchNormV3: conv3_1/bn/FusedBatchNormV3:5
Cannot infer shape for conv3_1/1/conv3_1/1/bn/FusedBatchNormV3: conv3_1/1/conv3_1/1/bn/FusedBatchNormV3:5
Cannot infer shape for conv3_3/bn/FusedBatchNormV3: conv3_3/bn/FusedBatchNormV3:5
Cannot infer shape for conv3_3/1/conv3_3/1/bn/FusedBatchNormV3: conv3_3/1/conv3_3/1/bn/FusedBatchNormV3:5
Cannot infer shape for conv4_1/bn/FusedBatchNormV3: conv4_1/bn/FusedBatchNormV3:5
Cannot infer shape for conv4_1/1/conv4_1/1/bn/FusedBatchNormV3: conv4_1/1/conv4_1/1/bn/FusedBatchNormV3:5
Cannot infer shape for conv4_3/bn/FusedBatchNormV3: conv4_3/bn/FusedBatchNormV3:5
Cannot infer shape for conv4_3/1/conv4_3/1/bn/FusedBatchNormV3: conv4_3/1/conv4_3/1/bn/FusedBatchNormV3:5
Cannot infer shape for fc1/fc1/bn/FusedBatchNormV3: fc1/fc1/bn/FusedBatchNormV3:5
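From what I understand, the trailing `:5` in each warning is a TF tensor name suffix naming output index 5 of the node. `FusedBatchNormV3` defines six outputs (indices 0-5), and as far as I can tell index 5 is a reserve-space tensor only populated during training, which would explain why its shape cannot be inferred from a frozen inference graph. A small sketch of how the name splits:

```python
# A TF tensor name is "<node_name>:<output_index>".
# Splitting on the last colon recovers the node and the output index;
# index 5 on FusedBatchNormV3 is the sixth (training-only) output.
tensor_name = "conv1_1/conv1_1/bn/FusedBatchNormV3:5"
node_name, output_index = tensor_name.rsplit(":", 1)
print(node_name)          # conv1_1/conv1_1/bn/FusedBatchNormV3
print(int(output_index))  # 5
```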

System config:
tf2onnx: 1.5.6
onnx: 1.7.0
tensorflow: 1.15
hardware: Jetson TX2 - Linux aarch64

pb file link:
https://drive.google.com/file/d/12IMU7wLmsnDwynIuyUtbmhoRTeTyyFqS/view?usp=sharing

Minimal Code:

import tensorflow as tf
import tf2onnx
import onnx

path = "models/mars.pb"

# Load a frozen GraphDef from disk into a new tf.Graph
def load_frozen_pb(path_to_pb):
    with tf.io.gfile.GFile(path_to_pb, "rb") as f:
        graph_def = tf.compat.v1.GraphDef()
        graph_def.ParseFromString(f.read())
    with tf.Graph().as_default() as g:
        tf.import_graph_def(graph_def, name='')
        return g

g = load_frozen_pb(path)
sess = tf.compat.v1.Session(graph=g)

# Convert the TF graph to ONNX (opset 11) and serialize it
onnx_graph = tf2onnx.tfonnx.process_tf_graph(
    sess.graph, input_names=["images:0"], output_names=["features:0"], opset=11)
model_proto = onnx_graph.make_model("test")
with open("model.onnx", "wb") as f:
    f.write(model_proto.SerializeToString())

# Sanity-check the exported model
model1 = onnx.load("model.onnx")
onnx.checker.check_model(model1)
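For comparison, the same conversion can also be run through the tf2onnx command-line entry point (a sketch using the same paths and tensor names as above; flags as documented for tf2onnx 1.5.x):

```shell
# Equivalent conversion via the tf2onnx CLI (assumes tf2onnx 1.5.6 is installed);
# reads the frozen graph at models/mars.pb and writes model.onnx.
python -m tf2onnx.convert \
    --input models/mars.pb \
    --inputs images:0 \
    --outputs features:0 \
    --opset 11 \
    --output model.onnx
```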
