
tensorflowlite v2.3 + XNNPACK run into error {ModifyGraphWithDelegate is disallowed} #42757

Closed
honglh opened this issue Aug 29, 2020 · 5 comments
Labels: comp:lite (TF Lite related issues), stale (to be closed automatically if no activity), stat:awaiting response (awaiting response from author), TF 2.3 (issues related to TF 2.3), type:bug

Comments

honglh commented Aug 29, 2020


System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow): No
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Raspbian Buster 32-bit
  • Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on a mobile device: RPI2
  • TensorFlow installed from (source or binary): source
  • TensorFlow version (use command below): v2.3.0
  • Python version: N/A
  • Bazel version (if compiling from source): 3.1.0
  • GCC/Compiler version (if compiling from source): 7.5
  • CUDA/cuDNN version: N/A
  • GPU model and memory: N/A
  • Exact command to reproduce: N/A

You can obtain the TensorFlow version with:

python -c "import tensorflow as tf; print(tf.version.GIT_VERSION, tf.version.VERSION)"

Describe the problem

I built TensorFlow v2.3.0 with XNNPACK enabled (--define tflite_with_xnnpack=true). However, when I try to run inference with coco_ssd_mobilenet_v1_1.0_quant_2018_06_29.tflite, the starter model from the example page, I see this error:

INFO: Created TensorFlow Lite XNNPACK delegate for CPU.
ERROR: ModifyGraphWithDelegate is disallowed when graph is immutable.

XNNPACK was therefore not in effect during inference.

Is this error expected? If so, is there a CPU model that can be used with XNNPACK? (Right now there seems to be a dilemma: the object_detection optimization detailed here is not yet available in TensorFlow v2.3.0, but XNNPACK only works with v2.3.0, which makes coco_ssd_mobilenet_v1_1.0_quant_2018_06_29.tflite seem like the only CPU object detection model available at this point.)
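
For reference, here is a minimal, hypothetical C++ sketch (not the actual application code behind this report) of how the error can surface: it assumes the TF Lite library was built with --define tflite_with_xnnpack=true, so the default XNNPACK delegate is applied automatically and a later explicit ModifyGraphWithDelegate() call is rejected because the delegated graph has become immutable.

// Hypothetical repro sketch, not the reporter's actual code. It assumes the
// TF Lite library was built with --define tflite_with_xnnpack=true, so the
// default XNNPACK delegate is created automatically (hence the INFO log).
#include <memory>

#include "tensorflow/lite/delegates/xnnpack/xnnpack_delegate.h"
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

int main() {
  auto model = tflite::FlatBufferModel::BuildFromFile(
      "coco_ssd_mobilenet_v1_1.0_quant_2018_06_29.tflite");
  if (!model) return 1;

  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);

  // With tflite_with_xnnpack=true, the default XNNPACK delegate is applied by
  // the time AllocateTensors() returns, and the delegated graph is immutable.
  interpreter->AllocateTensors();

  // Explicitly applying the delegate again is then rejected with
  // "ModifyGraphWithDelegate is disallowed when graph is immutable."
  TfLiteXNNPackDelegateOptions opts = TfLiteXNNPackDelegateOptionsDefault();
  TfLiteDelegate* delegate = TfLiteXNNPackDelegateCreate(&opts);
  if (interpreter->ModifyGraphWithDelegate(delegate) != kTfLiteOk) {
    // Failing path seen above; the interpreter did not take the delegate,
    // so release it here.
    TfLiteXNNPackDelegateDelete(delegate);
  }

  // ... fill inputs, interpreter->Invoke(), read outputs ...
  return 0;
}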

honglh changed the title tensorflowlite + XNNPACK run into error {ModifyGraphWithDelegate is disallowed} to tensorflowlite v2.3 + XNNPACK run into error {ModifyGraphWithDelegate is disallowed} Aug 29, 2020
ravikyram added the comp:lite, TF 2.3, and type:support labels Aug 31, 2020
ravikyram assigned ymodak and unassigned ravikyram Aug 31, 2020
Maratyszcza (Contributor) commented

cc @multiverse-tf

sushreebarsa (Contributor) commented Dec 16, 2022

@honglh
We see that you're using an old version of TF (2.3). TF 2.3 does not contain all the changes needed to prevent code using invalid shapes from crashing the interpreter via CHECK failures. TF 2.3 is no longer supported, and it is unlikely to receive any bug fixes other than security patches. There is a high possibility that this was fixed in later TF versions. Could you refer to this doc, try with the latest TF version, 2.11.0, and let us know if it is still an issue? If the issue still persists, kindly provide standalone code to reproduce it. Thank you!

tilakrayal (Contributor) commented

@honglh,
When you build bazel build -c opt --define tflite_with_xnnpack=true tensorflow/lite/tools/benchmark:benchmark_model, it compiles a binary that applies the XNNPACK delegate by default.

When you don't use --define tflite_with_xnnpack=true while building the benchmark tool, you can simply pass --use_xnnpack=true to check the XNNPACK delegate's performance.

The error might be because the XNNPACK delegate was applied twice in this case (i.e., the first application is triggered by setting "--use_xnnpack=true", and the second is triggered by applying the XNNPACK delegate by default, i.e., setting "--define tflite_with_xnnpack=true" when compiling the benchmark tool).

Also, please take a look at this comment from the developer on a similar issue. Thank you!
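
One possible application-side guard (a hypothetical sketch, not a recipe from this thread or the TF Lite docs; BuildInterpreter is a made-up helper name) is to apply the XNNPACK delegate explicitly only as a best effort and treat a rejected ModifyGraphWithDelegate() call as non-fatal, so a binary linked against a tflite_with_xnnpack=true build keeps its default delegate instead of failing.

// Hypothetical guard, not code from this issue: apply XNNPACK explicitly as a
// best effort and fall back to the library's default delegate when the call
// is rejected (e.g., the binary was built with tflite_with_xnnpack=true).
#include <memory>

#include "tensorflow/lite/delegates/xnnpack/xnnpack_delegate.h"
#include "tensorflow/lite/interpreter.h"
#include "tensorflow/lite/kernels/register.h"
#include "tensorflow/lite/model.h"

// Note: `model` must outlive the returned interpreter.
std::unique_ptr<tflite::Interpreter> BuildInterpreter(
    const tflite::FlatBufferModel& model, int num_threads) {
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(model, resolver)(&interpreter);

  // With a tflite_with_xnnpack=true build, the default XNNPACK delegate is
  // applied around this point and the graph becomes immutable.
  interpreter->AllocateTensors();

  // Optional explicit application, useful for builds without the default
  // delegate; if it is rejected, keep whatever the library already applied.
  TfLiteXNNPackDelegateOptions opts = TfLiteXNNPackDelegateOptionsDefault();
  opts.num_threads = num_threads;
  TfLiteDelegate* delegate = TfLiteXNNPackDelegateCreate(&opts);
  if (interpreter->ModifyGraphWithDelegate(delegate) != kTfLiteOk) {
    TfLiteXNNPackDelegateDelete(delegate);
  }
  // On success, `delegate` must be deleted only after the interpreter is
  // destroyed; a real application would keep track of it.

  return interpreter;
}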

tilakrayal added the stat:awaiting response label Mar 13, 2023
tilakrayal self-assigned this Mar 15, 2023
tilakrayal added the stale label Mar 24, 2023
github-actions bot commented Apr 1, 2023

This issue was closed because it has been inactive for 7 days since being marked as stale. Please reopen if you'd like to work on this further.

github-actions bot closed this as completed Apr 1, 2023