Hi, I'm trying to get the Edge TPU working on a Raspberry Pi 4 and I'm having the same issue as a few others (#44), despite updating to tflite-runtime 2.1.0.post1 and making sure I use tflite_runtime instead of tf.lite.Interpreter.
Error message:
(tflite1) pi@raspberrypi:~/tflite1 $ python3 TFLite_detection_webcam.py --modeldir=mobilenet_quantized_1205 --edgetpu
INFO: Initialized TensorFlow Lite runtime.
/home/pi/tflite1/mobilenet_quantized_1205/edgetpu.tflite
Traceback (most recent call last):
  File "TFLite_detection_webcam.py", line 140, in <module>
    interpreter.allocate_tensors()
  File "/home/pi/.virtualenvs/tflite1/lib/python3.7/site-packages/tensorflow_core/lite/python/interpreter.py", line 244, in allocate_tensors
    return self._interpreter.AllocateTensors()
  File "/home/pi/.virtualenvs/tflite1/lib/python3.7/site-packages/tensorflow_core/lite/python/interpreter_wrapper/tensorflow_wrap_interpreter_wrapper.py", line 106, in AllocateTensors
    return _tensorflow_wrap_interpreter_wrapper.InterpreterWrapper_AllocateTensors(self)
RuntimeError: Internal: Unsupported data type in custom op handler: 0
Node number 2 (EdgeTpuDelegateForCustomOp) failed to prepare.
Output of dpkg -l | grep edgetpu:
(tflite1) pi@raspberrypi:~/tflite1 $ dpkg -l | grep edgetpu
ii  libedgetpu1-std:armhf  13.0  armhf  Support library for Edge TPU
Any advice welcome; I'm new to this, so it could be I've made a blind error somewhere.
UPDATE: SOLVED
I noticed that this block calls the TensorFlow interpreter, not the tflite_runtime interpreter. Since I do have TensorFlow installed, the script was picking up that interpreter, so by commenting out that branch I forced it to use the tflite_runtime interpreter:
pkg = importlib.util.find_spec('tensorflow')
if pkg is None:
    from tflite_runtime.interpreter import Interpreter
    if use_TPU:
        from tflite_runtime.interpreter import load_delegate
else:
    from tensorflow.lite.python.interpreter import Interpreter
    if use_TPU:
        from tensorflow.lite.python.interpreter import load_delegate
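Rather than commenting out the TensorFlow branch, the same fix can be expressed by inverting the check so tflite_runtime takes priority when both packages are installed. A minimal sketch of that idea (the helper name pick_interpreter_module is mine, not part of the tutorial script):

```python
import importlib.util

# Sketch of the fix described above: probe for tflite_runtime first instead
# of tensorflow, so the lightweight runtime wins whenever it is installed,
# and full TensorFlow is only a fallback.
def pick_interpreter_module():
    if importlib.util.find_spec('tflite_runtime') is not None:
        return 'tflite_runtime.interpreter'
    return 'tensorflow.lite.python.interpreter'

print(pick_interpreter_module())
```

The Interpreter and load_delegate imports would then come from whichever module this returns, matching the structure of the block above.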
Thanks
Following this tutorial: https://github.com/EdjeElectronics/TensorFlow-Lite-Object-Detection-on-Android-and-Raspberry-Pi/blob/master/Raspberry_Pi_Guide.md#step-1e-run-the-tensorflow-lite-model