Converting Boosted tree model to tflite #50667
Comments
It looks like these ops are not supported in TensorFlow Lite either. You need to follow the steps given here:
Hi Meghna, thanks for the reply. I followed the link and tried to use `register_custom_opdefs`, but I am getting this error: `AttributeError: module 'tensorflow._api.v2.lite' has no attribute 'python'`. Can you please let me know if there is anything I can do? Thanks,
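The AttributeError above occurs because the public `tf.lite` namespace does not expose a `python` submodule. As a hedged sketch, in some TF 2.x releases `register_custom_opdefs` has lived in the internal module `tensorflow.lite.python.convert`; this path is an assumption that is internal and version-dependent, so verify it against your installed TensorFlow:

```python
# Sketch: try to locate register_custom_opdefs without going through
# tf.lite.python (which raises AttributeError, since `python` is not part
# of the public tf.lite API).
# NOTE: the module path below is internal and version-dependent -- an
# assumption to verify against your installed TensorFlow version.
try:
    from tensorflow.lite.python.convert import register_custom_opdefs
except ImportError:  # also covers TensorFlow not being installed at all
    register_custom_opdefs = None

if register_custom_opdefs is None:
    print("register_custom_opdefs not found at this path; check your TF version")
else:
    print("register_custom_opdefs is available:", callable(register_custom_opdefs))
```

If the import succeeds, the function takes a list of OpDef text protos; note, however, that registering an op definition alone does not make the converter support it (see the allowlist discussion below in this thread).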
@Koushik667 can you try running the following code?
@abattery (tagging you in case you have more context): if there is an op that is unsupported in TF, how can we convert the model successfully? The API used here (
Adding them to the allowlist in the Select TF op support would be the easiest way to support them. I think the above Boosted tree ops are not included in the allowlist, but we do have the AsString op as a Select TF op. We can consider this a feature request to add the Boosted tree ops to the allowlist of the Select TF ops. Please consider using https://www.tensorflow.org/lite/guide/ops_select
Thanks a lot @abattery! @Koushik667 In TFLite, we maintain a list of TF operators (called the allowlist) that don't have a corresponding TFLite operator implementation (e.g. `AsString`). If a TF operator is in this allowlist, then we can convert the model by pulling the TF op into the TFLite model, by adding the constant `tf.lite.OpsSet.SELECT_TF_OPS` to the converter's `target_spec.supported_ops`.
In your model, the operators used (e.g. `BoostedTreesEnsembleResourceHandleOp`) are not in this allowlist. Hence, even though you set the above flag, it didn't work. To make this work, you need to follow the instructions in Update TFLite Ops Allowlist for TF Ops to add these operators to the allowlist. Once you complete this, your initial code should work as expected.
Note: once the update is made, you still need to follow the instructions here when running inference to use these TF operators.
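The resolution logic described above can be modeled in plain Python. This is an illustrative sketch, not TFLite's real implementation: the op names come from the error log in this issue, and the two sets below are toy stand-ins based on @abattery's note that AsString is flex-allowlisted while the Boosted tree ops are not (the real allowlist lives in the TFLite source):

```python
# Toy model of how the converter classifies each TF op, per the explanation
# above. These sets are illustrative assumptions, not TFLite's actual lists.
TFLITE_BUILTINS = {"Add", "Conv2D", "FullyConnected"}
FLEX_ALLOWLIST = {"AsString"}  # Select TF ops that may be pulled into a TFLite model

def classify(op_name, select_tf_ops=True):
    """Return how the converter would handle `op_name`."""
    if op_name in TFLITE_BUILTINS:
        return "builtin"          # has a native TFLite kernel
    if select_tf_ops and op_name in FLEX_ALLOWLIST:
        return "flex"             # pulled in via SELECT_TF_OPS
    # Mirrors the converter error reported in this issue:
    return f"'tf.{op_name}' op is neither a custom op nor a flex op"

for op in ("AsString", "BoostedTreesPredict", "BoostedTreesEnsembleResourceHandleOp"):
    print(op, "->", classify(op))
```

This shows why setting `SELECT_TF_OPS` alone is not enough: an op must also be on the flex allowlist, otherwise the converter reports it as neither a custom op nor a flex op.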
1. System information
2. Code
Option B: Paste your code here or provide a link to a custom end-to-end colab
```python
converter = tf.lite.TFLiteConverter.from_saved_model('edr_cust_model_new/1625736127/')  # path to the SavedModel directory
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite ops.
    tf.lite.OpsSet.SELECT_TF_OPS,    # enable TensorFlow ops.
]
tflite_model = converter.convert()

# Save the model.
with open('edr_cust_model_temp/model.tflite', 'wb') as f:
    f.write(tflite_model)
```
3. Short Error log
I am getting the following error during conversion:

```
ConverterError: :0: error: loc("boosted_trees"): 'tf.BoostedTreesEnsembleResourceHandleOp' op is neither a custom op nor a flex op
:0: error: loc("boosted_trees/BoostedTreesPredict"): 'tf.BoostedTreesPredict' op is neither a custom op nor a flex op
:0: error: loc("boosted_trees/head/predictions/str_classes"): 'tf.AsString' op is neither a custom op nor a flex op
:0: error: failed while converting: 'main': Ops that need custom implementation (enabled via setting the -emit-custom-ops flag):
tf.AsString {device = "", fill = "", precision = -1 : i64, scientific = false, shortest = false, width = -1 : i64}
tf.BoostedTreesEnsembleResourceHandleOp {container = "", device = "", shared_name = "boosted_trees/"}
tf.BoostedTreesPredict {device = "", logits_dimension = 7 : i64, num_bucketized_features = 18 : i64}
```
4. For the detailed error, please refer to this file:
tf_boosted_tree_convert_error_log.txt