
Converting Boosted tree model to tflite #50667

Closed
Koushik667 opened this issue Jul 8, 2021 · 7 comments
Assignees
Labels
comp:lite TF Lite related issues stat:awaiting tensorflower Status - Awaiting response from tensorflower TF 2.4 for issues related to TF 2.4 TFLiteConverter For issues related to TFLite converter type:bug Bug

Comments

@Koushik667

Koushik667 commented Jul 8, 2021

1. System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): macOS Big Sur
  • TensorFlow installation (pip package or built from source): pip package
  • TensorFlow library (version, if pip package or github SHA, if built from source): 2.4.1

2. Code


converter = tf.lite.TFLiteConverter.from_saved_model('edr_cust_model_new/1625736127/')  # path to the SavedModel directory
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite ops.
    tf.lite.OpsSet.SELECT_TF_OPS     # enable TensorFlow ops.
]

tflite_model = converter.convert()

# Save the model.
with open('edr_cust_model_temp/model.tflite', 'wb') as f:
    f.write(tflite_model)

3. Short Error log

I am getting the following error during conversion:

ConverterError: :0: error: loc("boosted_trees"): 'tf.BoostedTreesEnsembleResourceHandleOp' op is neither a custom op nor a flex op
:0: error: loc("boosted_trees/BoostedTreesPredict"): 'tf.BoostedTreesPredict' op is neither a custom op nor a flex op
:0: error: loc("boosted_trees/head/predictions/str_classes"): 'tf.AsString' op is neither a custom op nor a flex op
:0: error: failed while converting: 'main': Ops that need custom implementation (enabled via setting the -emit-custom-ops flag):
tf.AsString {device = "", fill = "", precision = -1 : i64, scientific = false, shortest = false, width = -1 : i64}
tf.BoostedTreesEnsembleResourceHandleOp {container = "", device = "", shared_name = "boosted_trees/"}
tf.BoostedTreesPredict {device = "", logits_dimension = 7 : i64, num_bucketized_features = 18 : i64}

4. For detailed error, please refer to this file

tf_boosted_tree_convert_error_log.txt

@Koushik667 Koushik667 added the TFLiteConverter For issues related to TFLite converter label Jul 8, 2021
@UsharaniPagadala UsharaniPagadala added comp:lite TF Lite related issues TF 2.4 for issues related to TF 2.4 type:bug Bug labels Jul 8, 2021
@ymodak ymodak assigned MeghnaNatraj and unassigned ymodak Jul 9, 2021
@ymodak ymodak added the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Jul 9, 2021
@MeghnaNatraj
Member

It looks like the ops are not supported in TensorFlow either. You need to follow the steps given here under "2. Unsupported in TensorFlow".


@Koushik667
Author

Hi Meghna, thanks for the reply.

I followed the link and tried to use register_custom_opdefs, but I am getting this:


AttributeError                            Traceback (most recent call last)
      5 # Register custom opdefs before the invocation of converter API.
----> 6 tf.lite.python.convert.register_custom_opdefs([custom_opdef])

AttributeError: module 'tensorflow._api.v2.lite' has no attribute 'python'

Can you please let me know if there is anything I can do?

Thanks,
Koushik
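The AttributeError occurs because the internal `python` submodule is not re-exported through the public `tf.lite` namespace, so the function has to be imported from its full module path. A minimal sketch of that workaround follows; note this is an internal, unsupported API that may change between releases, and the opdef text is a placeholder for illustration (based on the format in the TF docs), not a real Boosted Trees op definition:

```python
import tensorflow as tf

# register_custom_opdefs is not reachable as tf.lite.python.convert...
# (hence the AttributeError above); import it from the full module path.
# NOTE: internal, unsupported API -- may change between TF releases.
from tensorflow.lite.python.convert import register_custom_opdefs

# Hypothetical OpDef in text-proto form, for illustration only; a real
# definition for the Boosted Trees ops would need their actual inputs,
# outputs and attributes.
custom_opdef = """name: 'TFLiteAwesomeCustomOp' input_arg:
{ name: 'In' type: DT_FLOAT } output_arg: { name: 'Out' type: DT_FLOAT }
attr : { name: 'a1' type: 'float'} attr : { name: 'a2' type: 'list(float)'}"""

# Register custom opdefs before invoking the converter API.
register_custom_opdefs([custom_opdef])
```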

@MeghnaNatraj
Member

@Koushik667 can you try running the following code?
(This usually wouldn't work as your ops are not defined in TF, but let's try)

converter = tf.lite.TFLiteConverter.from_saved_model('edr_cust_model_new/1625736127/') # path to the SavedModel directory
converter.allow_custom_ops = True
tflite_model = converter.convert()

@abattery (tagging you in case you have more context): if there is an op that is unsupported in TF, how can we convert the model successfully? The API used here (tf.lite.python.convert.register_custom_opdefs([custom_opdef])) is not available to external users. It also looks like we have deprecated the older converter.custom_opdefs=... usage.

@MeghnaNatraj MeghnaNatraj reopened this Jul 14, 2021
@abattery
Contributor

Adding them to the allowlist in the Select TF op support would be the easiest way to support this. I think the Boosted Trees ops above are not included in the allowlist, but we do have the AsString op as a Select TF op. We can consider this a feature request to add the Boosted Trees ops to the Select TF ops allowlist.

Please consider using https://www.tensorflow.org/lite/guide/ops_select

@MeghnaNatraj
Member

MeghnaNatraj commented Jul 14, 2021

Thanks a lot @abattery!

@Koushik667 In TFLite, we maintain a list of TF operators (called the allowlist) that don't have a corresponding TFLite operator implementation (e.g., BoostedTreesEnsembleResourceHandleOp). If a TF operator is in this allowlist, we can convert the model by pulling the TF op into the TFLite model, by adding the constant tf.lite.OpsSet.SELECT_TF_OPS (note: this will increase the model size) as follows:

converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite ops.
    tf.lite.OpsSet.SELECT_TF_OPS     # enable TensorFlow ops.
]

In your model, the operators used (e.g., BoostedTreesEnsembleResourceHandleOp) are not in this allowlist, which is why setting the flag above didn't work. To make it work, you need to follow the instructions in Update TFLite Ops Allowlist for TF Ops to add these operators to the allowlist. Once you complete this, your initial code should work as expected:

converter = tf.lite.TFLiteConverter.from_saved_model('edr_cust_model_new/1625736127/')  # path to the SavedModel directory
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # enable TensorFlow Lite ops.
    tf.lite.OpsSet.SELECT_TF_OPS     # enable TensorFlow ops.
]

tflite_model = converter.convert()

# Save the model.
with open('edr_cust_model_temp/model.tflite', 'wb') as f:
    f.write(tflite_model)

Note: once the update is made, you still need to follow the instructions here when running inference in order to use these TF operators.
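As a self-contained sketch of that flow (using tf.as_string as a stand-in, since it is one of the ops in the error log and is already in the Select TF ops allowlist, rather than the boosted-trees model from this issue): convert a tiny function with SELECT_TF_OPS enabled, then run it with the Python tf.lite.Interpreter, which bundles the Flex delegate needed for the pulled-in TF op.

```python
import numpy as np
import tensorflow as tf

# tf.as_string has no TFLite builtin, but it IS in the Select TF ops
# allowlist, so conversion succeeds once SELECT_TF_OPS is enabled.
@tf.function(input_signature=[tf.TensorSpec([1, 3], tf.float32)])
def model(x):
    return tf.strings.as_string(x)

converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [model.get_concrete_function()])
converter.target_spec.supported_ops = [
    tf.lite.OpsSet.TFLITE_BUILTINS,  # TFLite builtin ops
    tf.lite.OpsSet.SELECT_TF_OPS,    # fall back to TF (Flex) ops
]
tflite_model = converter.convert()

# The standard Python interpreter bundles the Flex delegate required to run
# the pulled-in TF op; other runtimes must link the delegate explicitly.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
interpreter.set_tensor(inp['index'], np.ones(inp['shape'], dtype=np.float32))
interpreter.invoke()
out = interpreter.get_tensor(interpreter.get_output_details()[0]['index'])
```

For a model whose ops are not yet allowlisted (as with the Boosted Trees ops here), this same flow only works after the allowlist update described above.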

