[ONNXRuntimeError] : 10 : INVALID_GRAPH : This is an invalid model. Error in Node:model/multi_category_encoding/AsString : No Op registered for AsString with domain_version of 9 #1645
Looks like we don't support the AsString() op. Let me check if we can handle this in the converter. |
Is there a workaround, like a custom op, until the converter update? Thanks |
I have some code that maps AsString to ONNX Cast, which kind of works but doesn't honor all the attributes AsString has. |
It worked for me for the structured_classifier example, so we merged a PR. You can try with
|
@guschmue Thank you very much for the quick response. Now I am getting this error: No Op registered for LookupTableFindV2 with domain_version of 9
As before, it works normally in a python file, but in a flask app it throws an error. |
What are the shapes and dtypes of |
X_train, y_train, X_valid, y_valid: (1056, 16), (1056,), (191, 16), (191,) respectively; all numpy.ndarray.
|
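For reference, a quick way to print the shapes and dtypes being asked about. The arrays here are zero-filled stand-ins matching the reported shapes; float32 is an assumption, the real dtypes may differ:

```python
import numpy as np

# Stand-ins with the shapes reported above (float32 is an assumption)
X_train = np.zeros((1056, 16), dtype=np.float32)
y_train = np.zeros((1056,), dtype=np.float32)
X_valid = np.zeros((191, 16), dtype=np.float32)
y_valid = np.zeros((191,), dtype=np.float32)

for name, arr in [("X_train", X_train), ("y_train", y_train),
                  ("X_valid", X_valid), ("y_valid", y_valid)]:
    print(name, type(arr).__name__, arr.shape, arr.dtype)
```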
Can you upload the pickles to OneDrive/GoogleDrive/Dropbox and post a link? Are those all np.int32 or np.float32? |
Yes they are. Attached in Google Drive: https://drive.google.com/drive/folders/1HfB00dOuk-awSmIrSg92hmJFYzTpQNCr?usp=sharing. You can open with
|
Great, I just requested access to the drive link. |
Actually, as said before, the model is created successfully in a python file, and the InferenceSession is created successfully in a python file; the InferenceSession throws the error in the flask app. |
Ah, so sorry. Didn't catch that. What version of onnxruntime does the flask application use? |
Same kind of issue as in #1228 |
All same versions |
Can you please elaborate on this? Are you getting a "Default value of table lookup must be const." error? Are you running the conversion code within flask too, or just onnxruntime? You can save both the keras saved model and the onnx model with:
ExportedautoKeras_model.save("autokerasmodel")
onnx_model, _ = tf2onnx.convert.from_keras(ExportedautoKeras_model, output_path="autokeras.onnx")
I find it very surprising that you get different results in flask. Is your flask running from a different virtualenv? Are you sure your autokeras version is the same? |
Yes, I am using the same model for conversion to onnx. Inside flask, I am creating the model, converting it, and trying to get the session results all at once. |
I think it is very likely that the keras models you get in flask and the plain python script are different. Can you please add this line: |
Inside the Flask app, I have two functions: one creates the model and passes it to the onnx converter function. Not sure if that is an issue. |
That should not be an issue. Again to confirm, are you using the same virtualenv for flask as the python script? |
Yes, python and flask are in the same env. |
Is the training data you are using (X_train, y_train, X_valid, y_valid) the same values for both? |
Also, in a normal python file, the onnx conversion and InferenceSession work.
|
Are you able to capture the keras saved model from flask? |
Can you please confirm the prediction test? |
Yes I can. |
I am able to successfully run predictions using the onnx model I have generated. The model is uploaded to the shared drive folder as
Awesome. Please capture and upload the keras saved models and converted onnx models for flask and the python script and upload them to the Google Drive folder as |
I have uploaded "ONNXmodel.onnx" and "creditloan_prediction_20210806T210907" to the drive. Can you please try to create a session from either of the two? Both were created in flask. |
Both models give me the error:
But I will need a saved model to diagnose the cause of the conversion failure. If you are not able to upload a saved model due to privacy/security concerns, I can try to walk through the debugging on your end, or we can wait for @guschmue who might have better luck reproducing the issue with autokeras. |
Yes, that's the saved model from flask, and that's the error. |
May I know whether these warnings affect anything?
sorry, |
Awwww, thought we had fixed this. The models from python and flask are different. Keep in mind that autokeras can choose very different model architectures depending on the data it is given. I think it is likely you are training the python and flask models on different data. The "does not support precision, scientific and fill attributes for AsString" might or might not matter depending on how the lookup table is formatted. Can you upload the converted onnx model from flask again? |
The warning: we can't handle all attributes from AsString(), i.e. instead of float 123. onnx would have float 123.000000. Not sure if it hurts in this case; it might if the category mapper is behind it, because the lookup table would have the tf representation. |
Thanks for that, I uploaded the converted onnx model to the drive. Regarding different results: I actually meant the python model in flask and the same model converted to onnx in flask; those two prediction results are different. But if I build the model in a python file and convert it to onnx, those prediction results are the same. I'm very confused about why that is. |
Ah, do you get the same results between the flask keras model and the flask onnx model? I think you are almost certainly getting different models in flask and python. I'm not sure why, but I suspect you are giving autokeras different input data or different args. |
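The comparison being asked about can be done with a small numpy helper once both outputs are in hand as arrays. The arrays below are hypothetical stand-ins; in practice they would come from something like model.predict(...) on the keras side and sess.run(...) on the onnxruntime side:

```python
import numpy as np

def predictions_match(keras_out, onnx_out, atol=1e-4):
    """Compare two prediction arrays elementwise within a tolerance."""
    keras_out = np.asarray(keras_out)
    onnx_out = np.asarray(onnx_out)
    if keras_out.shape != onnx_out.shape:
        return False
    return bool(np.allclose(keras_out, onnx_out, atol=atol))

# Hypothetical stand-in outputs, not real model predictions
a = np.array([[0.12, 0.88], [0.70, 0.30]])
b = a + 1e-6

predictions_match(a, b)        # True: tiny float noise is within tolerance
predictions_match(a, a + 0.1)  # False: a real divergence
```

If the flask keras model and the flask onnx model disagree under this check while the python-script pair agrees, that points at the models themselves being different, not at onnxruntime.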
yeah, here
This is inside Flask |
Not sure whether those warnings make any difference |
The category mapper in the model looks like: That's likely not going to work. That said, I find the whole thing a little strange since the result is immediately cast back to float. Seems highly unlikely that this lookup table is useful. @hanzigs are you using real testing data on this? Do you find that the TF model produces useful results on non-training data? Also are you certain you are running the python script with the same data, args, and virtual environment as the flask app? |
Reg: same data, args, and virtual environment, yes I'm sure about that, because this flask app has 7 models: keras seq, lgbm, xgb, randomforest, extratrees, decisiontree, and autokeras. All the other 6 models work perfectly, and the data and args are passed the same way, so I'm sure those are correct in the flask app. Reg: cast back to float, I'm not sure what that is, but the testing data is correct. May I understand what the problem with the above would be, please? |
There is no complex code happening in autokeras
|
Category fields are normalized using WoE transformation; numeric fields are normalized using MinMaxScaler, separately.
Testing data transformation follows the same normalization steps for prediction.
The Flask app is a bit complex to take a miniature version out of, because it is linked to the database (elasticsearch) at each and every step; that's why I can't send the flask app code. |
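The fit-on-train, transform-on-test pattern described above (MinMaxScaler-style normalization) can be sketched with plain numpy; the arrays here are hypothetical stand-ins for the real numeric columns:

```python
import numpy as np

# Hypothetical train/test numeric columns
X_train = np.array([[1.0, 10.0], [3.0, 30.0], [5.0, 50.0]])
X_test = np.array([[2.0, 20.0], [4.0, 60.0]])

# Fit: learn min and range from the TRAINING data only
col_min = X_train.min(axis=0)
col_range = X_train.max(axis=0) - col_min

# Transform: apply the SAME parameters to both train and test
X_train_scaled = (X_train - col_min) / col_range
X_test_scaled = (X_test - col_min) / col_range

# Note: test values outside the training range scale outside [0, 1]
```

Fitting the scaler separately on the test data instead of reusing the training parameters is a common way to get matching-looking pipelines that feed different values to the model.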
The issue is that the input to the CategoryMapper (lookup table) comes from an AsString op in TF, which converts a number to a string. There is no corresponding op in ONNX, so we convert to a Cast, but that won't necessarily use the same precision, i.e. 0.0 becomes "0.0", not "0.00000". The lookup for 0.0 will then return 1, not 5, and the results may differ. If we were doing int to string it would be consistent, but float to string is more problematic. |
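The key-mismatch failure mode described here can be illustrated with a plain dict standing in for the CategoryMapper lookup table; the keys, values, and formatting conventions below are hypothetical:

```python
# Hypothetical lookup table whose keys were written with one string convention
table = {"123": 5, "456": 7}
default_value = 1  # what CategoryMapper-style lookups return on a miss

train_key = "123"            # the convention used when the table was built
runtime_key = "%f" % 123.0   # "123.000000": a different float-to-string convention

hit = table.get(train_key, default_value)     # key matches: 5
miss = table.get(runtime_key, default_value)  # key differs: falls back to 1
```

Both keys represent the same number, but the string representations differ, so the second lookup silently falls through to the default and the downstream predictions change.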
Is the data from the database used to train the autokeras model? |
Yes it is |
How does the python script get the data then? What data does it use? |
A python elasticsearch client pulls the data. |
Can you pickle the data from each and compare that they are identical? |
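The comparison being asked for might look like this. The dicts below are hypothetical stand-ins for the data from each environment; in practice you would pickle.dump each to a file in its own app, then load both side by side:

```python
import pickle
import numpy as np

# Hypothetical stand-ins for the data pulled in flask and in the python script
flask_data = {"X_train": np.arange(6).reshape(2, 3), "y_train": np.array([0, 1])}
script_data = {"X_train": np.arange(6).reshape(2, 3), "y_train": np.array([0, 1])}

# Round-trip through pickle (stands in for dumping to and loading from files)
a = pickle.loads(pickle.dumps(flask_data))
b = pickle.loads(pickle.dumps(script_data))

# Compare array-by-array; any mismatch means the two models trained on different data
identical = all(np.array_equal(a[k], b[k]) for k in a)
print(identical)
```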
Yes I can. |
The prediction testing happens from Postman. |
But if I build the model step by step in a python file, calling only the functions of the flask app, and test it, it works fine. |
Anyway, I will check that. Thanks for the support, much appreciated. You can close this ticket. |
Hi @TomWildenhain-Microsoft The 4th model, the one with the catmapper, built from the flask app, is not giving the correct result. |
Hi @TomWildenhain-Microsoft |
Hi, https://drive.google.com/drive/folders/1HfB00dOuk-awSmIrSg92hmJFYzTpQNCr?usp=sharing Let me know whether it's possible to convert. Thanks |
Below code works perfectly when run in a python file (python==3.9.5, tensorflow==2.5.0, keras2onnx==1.7.0, onnxruntime==1.8.0, keras==2.4.3, tf2onnx==1.9.1).
Same code inside the Flask app: InferenceSession throws the error.
If that's a converter bug, how should I find the correct opset? (I have tried opsets from 9 to 13; all throw errors.) And why is that error not raised in a standalone run?
Any help please, Thanks