No model found in config file when trying to load converted model #25

Open
dchouren opened this issue Jan 13, 2017 · 4 comments

dchouren commented Jan 13, 2017

I used caffe2keras to convert the VGG16-hybrid1365 Caffe model to an .h5 file. The conversion went fine; I used the caffemodel and prototxt found here: https://github.com/metalbubble/places365.

However, when I try to load the result with load_model, I get ValueError: No model found in config file.

>>> import keras.models
Using Theano backend.
>>> x = keras.models.load_model('keras/keras/caffe/models/Keras_model_weights.h5')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/tigress/dchouren/git_sources/keras/keras/models.py", line 140, in load_model
    raise ValueError('No model found in config file.')
ValueError: No model found in config file.

Theano and Keras are up to date.

Any idea why the model is not converted properly?

The only other related issue I could find was fchollet/deep-learning-models#1, which didn't shed any light.


Converted with

python caffe2keras.py -load_path /tigress/dchouren/thesis/src/keras/keras/caffe/models/ -prototxt deploy_vgg16_hybrid1365.prototxt -caffemodel vgg16_hybrid1365.caffemodel
.
.
.
LOADING WEIGHTS
Finished converting model.
Storing model...
Finished storing the converted model to /tigress/dchouren/thesis/src/keras/keras/caffe/models/


dchouren commented Jan 14, 2017

Not sure if this is relevant, but I did some basic inspection with h5py and got this:


>>> import h5py
>>> x = h5py.File('models/Keras_model_weights.h5', 'r')
>>> [y for y in x]
['conv1_1', 'conv1_1_zeropadding', 'conv1_2', 'conv1_2_zeropadding', 'conv2_1', 'conv2_1_zeropadding', 'conv2_2', 'conv2_2_zeropadding', 'conv3_1', 'conv3_1_zeropadding', 'conv3_2', 'conv3_2_zeropadding', 'conv3_3', 'conv3_3_zeropadding', 'conv4_1', 'conv4_1_zeropadding', 'conv4_2', 'conv4_2_zeropadding', 'conv4_3', 'conv4_3_zeropadding', 'conv5_1', 'conv5_1_zeropadding', 'conv5_2', 'conv5_2_zeropadding', 'conv5_3', 'conv5_3_zeropadding', 'data', 'drop6', 'drop7', 'fc6', 'fc6_flatten', 'fc7', 'fc8a', 'pool1', 'pool2', 'pool3', 'pool4', 'pool5', 'prob', 'relu1_1', 'relu1_2', 'relu2_1', 'relu2_2', 'relu3_1', 'relu3_2', 'relu3_3', 'relu4_1', 'relu4_2', 'relu4_3', 'relu5_1', 'relu5_2', 'relu5_3', 'relu6', 'relu7']
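
If I understand the Keras save format correctly, load_model raises exactly this error when the file has no model_config attribute (the architecture JSON that Keras writes alongside the weights when a full model is saved). A quick check along those lines, continuing the session above, would presumably come back False here:

>>> 'model_config' in x.attrs   # present only when a full model (architecture + weights) was saved
False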


MarcBS commented Jan 20, 2017

Hi @dchouren, are you trying to load the model with the original Keras version, or with this fork?

dchouren commented

Have tried both with the same result.


dchouren commented Feb 3, 2017

@MarcBS So it seems that what's being created is just a weights file. Is that correct? I can create a model, say VGG16, and then use model.load_weights('...') and that works. I would suggest changing the output message from 'Finished storing the converted model...' to indicate that this isn't a full .h5 model file but rather just the layer weights, so there's no confusion.
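
For anyone else who runs into this, the workaround looks roughly like the sketch below. rebuild_vgg16_hybrid1365() is just a placeholder for whatever code recreates the converted architecture in Keras with matching layers; the path is the converted file from above.

# Placeholder: recreate the same architecture (matching layers) in Keras,
# either by hand or with this fork's Caffe-to-Keras conversion utilities.
model = rebuild_vgg16_hybrid1365()

# The converted .h5 file holds only layer weights, not the architecture,
# so load_weights works where load_model does not.
model.load_weights('keras/keras/caffe/models/Keras_model_weights.h5')
# (by_name=True can help if the layer order differs but the names match)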
