Runtime error when using Edge TPU with Pi 4 #68

Closed
tomarnison opened this issue Mar 5, 2020 · 0 comments
tomarnison commented Mar 5, 2020

Hi, I am trying to get the Edge TPU working on a Raspberry Pi 4 and I seem to be having the same issue as a few others (#44), despite updating to tflite-runtime 2.1.0.post1 and making sure I use the tflite_runtime interpreter instead of tf.lite.Interpreter.

Error message:

(tflite1) pi@raspberrypi:~/tflite1 $ python3 TFLite_detection_webcam.py --modeldir=mobilenet_quantized_1205 --edgetpu
INFO: Initialized TensorFlow Lite runtime.
/home/pi/tflite1/mobilenet_quantized_1205/edgetpu.tflite
Traceback (most recent call last):
  File "TFLite_detection_webcam.py", line 140, in <module>
    interpreter.allocate_tensors()
  File "/home/pi/.virtualenvs/tflite1/lib/python3.7/site-packages/tensorflow_core/lite/python/interpreter.py", line 244, in allocate_tensors
    return self._interpreter.AllocateTensors()
  File "/home/pi/.virtualenvs/tflite1/lib/python3.7/site-packages/tensorflow_core/lite/python/interpreter_wrapper/tensorflow_wrap_interpreter_wrapper.py", line 106, in AllocateTensors
    return _tensorflow_wrap_interpreter_wrapper.InterpreterWrapper_AllocateTensors(self)
RuntimeError: Internal: Unsupported data type in custom op handler: 0Node number 2 (EdgeTpuDelegateForCustomOp) failed to prepare.

Output of grep edgetpu:

(tflite1) pi@raspberrypi:~/tflite1 $ dpkg -l | grep edgetpu
ii  libedgetpu1-std:armhf                 13.0                    armhf        Support library for Edge TPU

Any advice is welcome; I'm new to this, so I may have made an obvious mistake somewhere.

Thanks

Following this tutorial: https://github.com/EdjeElectronics/TensorFlow-Lite-Object-Detection-on-Android-and-Raspberry-Pi/blob/master/Raspberry_Pi_Guide.md#step-1e-run-the-tensorflow-lite-model

UPDATE: SOLVED
I noticed that the code below imports the TensorFlow interpreter, not the tflite_runtime interpreter, whenever full TensorFlow is installed. Since I do have TensorFlow installed, it was using that interpreter, so by commenting out the TensorFlow branch I forced the script to use the tflite_runtime interpreter:

pkg = importlib.util.find_spec('tensorflow')
if pkg is None:
    from tflite_runtime.interpreter import Interpreter
    if use_TPU:
        from tflite_runtime.interpreter import load_delegate
else:
    from tensorflow.lite.python.interpreter import Interpreter
    if use_TPU:
        from tensorflow.lite.python.interpreter import load_delegate
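The selection logic above can be sketched as a small helper (the function name and `prefer_runtime` flag are hypothetical, not part of the tutorial's script). Setting `prefer_runtime=True` reproduces the fix described above: the lightweight tflite_runtime interpreter is always chosen, even when full TensorFlow is installed.

```python
import importlib.util

def pick_interpreter_module(use_tpu, prefer_runtime=False):
    """Return the module Interpreter (and load_delegate, if use_tpu is True)
    would be imported from. prefer_runtime=True mimics commenting out the
    TensorFlow branch: tflite_runtime wins even if TensorFlow is installed."""
    tf_installed = importlib.util.find_spec('tensorflow') is not None
    if prefer_runtime or not tf_installed:
        return 'tflite_runtime.interpreter'
    # Full TensorFlow installed and runtime not forced: the tutorial's
    # original code would import the TF-bundled interpreter, which is the
    # path that failed to prepare the EdgeTpuDelegateForCustomOp node.
    return 'tensorflow.lite.python.interpreter'

# With the fix applied, the standalone runtime is always selected:
print(pick_interpreter_module(use_tpu=True, prefer_runtime=True))
# → tflite_runtime.interpreter
```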