
Tensorflow Lite issue on Raspberry Pi #21855

Open
abhi-rf opened this issue Aug 24, 2018 · 14 comments

@abhi-rf commented Aug 24, 2018

Hi everyone. I am running some TensorFlow scripts on a Raspberry Pi 3. When I try to load a TFLite model into the TFLite interpreter, it throws an error. Please help :) :)

System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow): No
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Raspbian Stretch
  • Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device: Raspberry Pi 3
  • TensorFlow installed from (source or binary): binary
  • TensorFlow version (use command below): 1.9
  • Python version: 2.7
  • Bazel version (if compiling from source): N/A
  • GCC/Compiler version (if compiling from source): N/A
  • CUDA/cuDNN version: N/A
  • GPU model and memory: N/A
  • Exact command to reproduce: -

Source code / logs

###THE CODE###

    interpreter = tf.contrib.lite.Interpreter(model_path='mobilenet_v1_0.25_128_quant.tflite')

###ERROR###

    /home/pi/.local/lib/python2.7/site-packages/tensorflow/python/framework/tensor_util.py:32: RuntimeWarning: numpy.dtype size changed, may indicate binary incompatibility. Expected 56, got 52
      from tensorflow.python.framework import fast_tensor_util
    Traceback (most recent call last):
      File "mobilenet_int_tflite.py", line 24, in <module>
        interpreter = tf.contrib.lite.Interpreter(model_path='mobilenet_v1_0.25_128_quant.tflite')
      File "/home/pi/.local/lib/python2.7/site-packages/tensorflow/contrib/lite/python/interpreter.py", line 50, in __init__
        _interpreter_wrapper.InterpreterWrapper_CreateWrapperCPPFromFile(
      File "/home/pi/.local/lib/python2.7/site-packages/tensorflow/python/util/lazy_loader.py", line 53, in __getattr__
        module = self._load()
      File "/home/pi/.local/lib/python2.7/site-packages/tensorflow/python/util/lazy_loader.py", line 42, in _load
        module = importlib.import_module(self.__name__)
      File "/usr/lib/python2.7/importlib/__init__.py", line 37, in import_module
        __import__(name)
      File "/home/pi/.local/lib/python2.7/site-packages/tensorflow/contrib/lite/python/interpreter_wrapper/tensorflow_wrap_interpreter_wrapper.py", line 28, in <module>
        _tensorflow_wrap_interpreter_wrapper = swig_import_helper()
      File "/home/pi/.local/lib/python2.7/site-packages/tensorflow/contrib/lite/python/interpreter_wrapper/tensorflow_wrap_interpreter_wrapper.py", line 24, in swig_import_helper
        _mod = imp.load_module('_tensorflow_wrap_interpreter_wrapper', fp, pathname, description)
    ImportError: /home/pi/.local/lib/python2.7/site-packages/tensorflow/contrib/lite/python/interpreter_wrapper/_tensorflow_wrap_interpreter_wrapper.so: undefined symbol: _ZN6tflite12tensor_utils39NeonMatrixBatchVectorMultiplyAccumulateEPKfiiS2_iPfi
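The ImportError means the dynamic linker found the SWIG wrapper `.so` but could not resolve one of its symbols (the missing symbol is a NEON kernel, suggesting a wheel built for a mismatched CPU/toolchain). One way to reproduce the failure outside TensorFlow's lazy import machinery is to `dlopen` the shared object directly with `ctypes`. This is only a diagnostic sketch; the path below is copied from the traceback above and will differ on other installs:

```python
import ctypes
import os

# Path taken from the traceback above; adjust for your own installation.
SO_PATH = os.path.expanduser(
    "~/.local/lib/python2.7/site-packages/tensorflow/contrib/lite/"
    "python/interpreter_wrapper/_tensorflow_wrap_interpreter_wrapper.so")

def try_load(path):
    """dlopen the shared object and report why loading fails, if it does."""
    try:
        ctypes.CDLL(path)
        return "loaded OK"
    except OSError as exc:
        # Both a missing file and an unresolved symbol surface as OSError;
        # the message distinguishes the two cases.
        return "load failed: %s" % exc

print(try_load(SO_PATH))
```

If this prints the same `undefined symbol` message, the problem is in the binary itself, not in how Python imports it, which is consistent with switching to a different wheel fixing the issue.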

@freedomtan (Contributor) commented Aug 27, 2018

Same problem as #21574

@rockyrhodes commented Sep 14, 2018

@abhi-rf: Agree with @freedomtan that this seems like the same problem as #21574. In that thread, it is suggested that upgrading to 1.10 fixed the problem. Have you tried this?

@DanielhCarranza commented Oct 4, 2018

I have the same issue with Raspbian 9.0, TensorFlow 1.9.0, Python 3.5.3 on a Raspberry Pi 3B. I tried to solve it by installing TensorFlow 1.10.0, but even after doing so it still shows the same problem. Did anyone solve it?

@DanielhCarranza commented Oct 18, 2018

I am still having the same problem.

@rockyrhodes rockyrhodes assigned petewarden and aselle and unassigned rockyrhodes Oct 22, 2018
@PINTO0309 commented Oct 23, 2018

#23082

@petewarden petewarden removed their assignment Feb 23, 2019
@harsh020goyal commented Apr 9, 2019

I have the same issue with Raspbian 9.0, TensorFlow 1.13.0, Python 3.5.3 on a Raspberry Pi 3B.

@pedronietogft commented May 27, 2019

I am having the same issue with TensorFlow 1.13 as well.

@Drkstr commented Jun 9, 2019

Me too: TF 1.13.1 on a Pi 3B.

Any suggestions on how we might go about resolving this?

@jingw222 commented Jul 4, 2019

I have exactly the same issue with TensorFlow 1.13.1 on a Raspberry Pi running Python 3.6.8. Here is the output:

Traceback (most recent call last):
  File "tflite.py", line 20, in <module>
    interpreter = tf.lite.Interpreter(MODEL_PATH)
  File "/home/pi/.pyenv/versions/3.6.8/lib/python3.6/site-packages/tensorflow/lite/python/interpreter.py", line 54, in __init__
    _interpreter_wrapper.InterpreterWrapper_CreateWrapperCPPFromFile(
  File "/home/pi/.pyenv/versions/3.6.8/lib/python3.6/site-packages/tensorflow/python/util/lazy_loader.py", line 61, in __getattr__
    module = self._load()
  File "/home/pi/.pyenv/versions/3.6.8/lib/python3.6/site-packages/tensorflow/python/util/lazy_loader.py", line 44, in _load
    module = importlib.import_module(self.__name__)
  File "/home/pi/.pyenv/versions/3.6.8/lib/python3.6/importlib/__init__.py", line 126, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 994, in _gcd_import
  File "<frozen importlib._bootstrap>", line 971, in _find_and_load
  File "<frozen importlib._bootstrap>", line 955, in _find_and_load_unlocked
  File "<frozen importlib._bootstrap>", line 665, in _load_unlocked
  File "<frozen importlib._bootstrap_external>", line 678, in exec_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
  File "/home/pi/.pyenv/versions/3.6.8/lib/python3.6/site-packages/tensorflow/lite/python/interpreter_wrapper/tensorflow_wrap_interpreter_wrapper.py", line 28, in <module>
    _tensorflow_wrap_interpreter_wrapper = swig_import_helper()
  File "/home/pi/.pyenv/versions/3.6.8/lib/python3.6/site-packages/tensorflow/lite/python/interpreter_wrapper/tensorflow_wrap_interpreter_wrapper.py", line 24, in swig_import_helper
    _mod = imp.load_module('_tensorflow_wrap_interpreter_wrapper', fp, pathname, description)
  File "/home/pi/.pyenv/versions/3.6.8/lib/python3.6/imp.py", line 243, in load_module
    return load_dynamic(name, filename, file)
  File "/home/pi/.pyenv/versions/3.6.8/lib/python3.6/imp.py", line 343, in load_dynamic
    return _load(spec)
  File "<frozen importlib._bootstrap>", line 684, in _load
  File "<frozen importlib._bootstrap>", line 658, in _load_unlocked
  File "<frozen importlib._bootstrap>", line 571, in module_from_spec
  File "<frozen importlib._bootstrap_external>", line 922, in create_module
  File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
ImportError: /home/pi/.pyenv/versions/3.6.8/lib/python3.6/site-packages/tensorflow/lite/python/interpreter_wrapper/_tensorflow_wrap_interpreter_wrapper.so: undefined symbol: _ZN6tflite12tensor_utils24NeonVectorScalarMultiplyEPKaifPf

Any ideas on how to sort this out?

@harshgoyal020 commented Jul 4, 2019

Install TensorFlow from a wheel file.

@jingw222 commented Jul 4, 2019

@harshgoyal020 which specific wheel build? Mind telling me more?

@harshgoyal020 commented Jul 4, 2019

Get the wheel file from https://github.com/lhelontra/tensorflow-on-arm/releases/download/v1.13.1/tensorflow-1.13.1-cp35-none-linux_armv7l.whl

Then install it with:

    pip3 install tensorflow-1.13.1-cp35-none-linux_armv7l.whl
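One caveat worth checking before installing: the `cp35` tag in that filename means the wheel targets CPython 3.5 specifically, so pip will likely refuse it on other Python versions (such as the 3.6.8 setup reported earlier in this thread). As a rough sanity check, a wheel filename can be parsed per PEP 427 and its Python tag compared against the running interpreter. This simplified sketch ignores the many extra compatibility cases pip's real resolver handles:

```python
import sys

def wheel_matches_interpreter(wheel_name):
    """Rough check that a wheel's Python tag matches this interpreter.

    Wheel filenames follow PEP 427: name-version-pytag-abitag-platform.whl.
    A 'cpXY' tag must match the running CPython X.Y exactly, while a bare
    'py2'/'py3' tag only pins the major version. (pip's real tag matching
    covers many more cases; this is only a quick sanity check.)
    """
    stem = wheel_name[:-len(".whl")]
    _name, _version, pytag, _abitag, _platform = stem.rsplit("-", 4)
    exact = "cp%d%d" % sys.version_info[:2]
    generic = "py%d" % sys.version_info[0]
    # A pytag may contain several dot-separated tags, e.g. 'py2.py3'.
    return any(tag in (exact, generic) for tag in pytag.split("."))

# The wheel suggested above only matches CPython 3.5:
print(wheel_matches_interpreter(
    "tensorflow-1.13.1-cp35-none-linux_armv7l.whl"))
```

If this prints False, look for a build of the same release whose tag matches your interpreter (the tensorflow-on-arm releases page ships wheels for several Python versions).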

@jingw222 commented Jul 4, 2019

@harshgoyal020 thanks a bunch, I'll give it a try. Given that the community builds work flawlessly, does that mean there's a bug in the official builds that needs to be fixed somewhere down the line?

@harshgoyal020 commented Jul 4, 2019

Official builds have less support for Raspberry Pi than builds from source using Bazel, but a Bazel build takes a lot of time. So I tried the wheel file and it worked with no errors. The official builds need to be updated.
