Problems in Inferring input shape in caffe_to_keras function in convert.py #34

Open
akshaychawla opened this issue Apr 4, 2017 · 0 comments

@akshaychawla

Hey Marc!
I'm having some trouble with the `caffe_to_keras` function in convert.py. Specifically, `config.input_dim` comes back empty, so `tuple(config.input_dim[1:])` ends up as an empty tuple.
To reproduce the error, I used ipdb and set a breakpoint:

```python
def caffe_to_keras(prototext, caffemodel, phase='train', debug=False):

    config = caffe.NetParameter()
    prototext = preprocessPrototxt(prototext, debug)
    text_format.Merge(prototext, config)

    if len(config.layers) != 0:
        raise Exception("Prototxt files V1 are not supported.")
        layers = config.layers[:]   # prototext V1
    elif len(config.layer) != 0:
        layers = config.layer[:]    # prototext V2
    else:
        raise Exception('could not load any layers from prototext')

    import ipdb; ipdb.set_trace()  # breakpoint 49be92bd //
    print("CREATING MODEL")
    model = create_model(layers,
                         0 if phase == 'train' else 1,
                         tuple(config.input_dim[1:]),
                         debug)
```

```
ipdb> config.input_dim[0:]
[]
```
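
My guess is that the parsed prototxt simply never sets the deprecated top-level `input_dim` fields; newer prototxt files usually declare the shape through an `input_shape { dim: ... }` block or through an explicit `Input` layer instead. A rough fallback I've been playing with is below (the `infer_input_shape` helper is just something I sketched for this issue, not anything that exists in convert.py):

```python
def infer_input_shape(config):
    """Check the three places a Caffe prototxt can declare the input shape.
    (hypothetical helper sketched for this issue, not part of convert.py)"""
    # 1) Deprecated top-level fields: input_dim: 1  input_dim: 3  ...
    if len(config.input_dim) != 0:
        return tuple(config.input_dim[1:])
    # 2) Newer top-level block: input_shape { dim: 1 dim: 3 dim: 500 dim: 500 }
    if len(config.input_shape) != 0:
        return tuple(config.input_shape[0].dim[1:])
    # 3) An explicit "Input" layer: input_param { shape { dim: 1 dim: 3 ... } }
    for layer in config.layer:
        if layer.type == 'Input' and len(layer.input_param.shape) != 0:
            return tuple(layer.input_param.shape[0].dim[1:])
    return None  # shape not declared anywhere; the caller has to supply one
```

With something like that, `create_model(layers, 0 if phase == 'train' else 1, infer_input_shape(config), debug)` should work for either prototxt style, assuming the shape is actually declared somewhere.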

I tried it with both debug.prototxt and train_val_for_keras.prototxt, and faced the same issue. Any suggestions on how to fix it? For now I'm temporarily passing a fixed value of [None, 3, 500, 500] to the `create_model` function.
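
Concretely, the stop-gap looks roughly like this (the hard-coded `[None, 3, 500, 500]` is just the batch/channel/height/width of my own data, not something read from the prototxt):

```python
# Temporary workaround: hard-code [batch, channels, height, width] instead of
# reading it from the prototxt, then drop the batch entry the same way
# tuple(config.input_dim[1:]) would.
fixed_input_dim = [None, 3, 500, 500]

model = create_model(layers,
                     0 if phase == 'train' else 1,
                     tuple(fixed_input_dim[1:]),
                     debug)
```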
