Add to documentation: how to install TensorFlow Lite onto Raspberry Pi 3B+ Raspbian Stretch #19110

Closed
jdlamstein opened this Issue May 5, 2018 · 7 comments

jdlamstein commented May 5, 2018

System information

  • Have I written custom code (as opposed to using a stock example script provided in TensorFlow):
    No.
  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04):
Raspberry Pi 3B+, Raspbian Stretch
  • TensorFlow installed from (source or binary):
    What I'm asking about.
  • TensorFlow version (use command below):
TensorFlow version 1.7.0
  • Python version:
    Python 3.5
  • Bazel version (if compiling from source):
    What I'm asking about.
  • GCC/Compiler version (if compiling from source):
    What I'm asking about.
  • CUDA/cuDNN version:
    N/A
  • GPU model and memory:
    Broadcom VideoCore IV @ 250 MHz (BCM2837: 3D part of GPU @ 300 MHz, video part of GPU @ 400 MHz)
  • Exact command to reproduce:
Addition to documentation: installation of TensorFlow Lite on Raspberry Pi 3B+ with Raspbian Stretch.

Describe the problem

I would like instructions on how to install TensorFlow Lite onto a Raspberry Pi 3B+ running Raspbian Stretch. To the best of my knowledge, the documentation doesn't yet cover installation for Raspbian Stretch. I posted on the Raspberry Pi Stack Exchange (https://raspberrypi.stackexchange.com/questions/83498/how-do-i-install-tensorflow-lite-on-raspbian-stretch) and was directed here. My goal is to deploy a TensorFlow neural network onto the Raspberry Pi 3B+ with TensorFlow Lite, roughly as sketched below.
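
To make the end goal concrete, this is the kind of script I'm hoping to run once TensorFlow Lite is installed. It is only a sketch under my own assumptions: a pip-installed TensorFlow build that ships the TFLite interpreter, and a model I've already converted to a file I'm calling `model.tflite`. The interpreter has also lived under `tf.contrib.lite` in some 1.x releases, so the import path may differ depending on the version.

```python
# Sketch: load a converted .tflite model on the Pi and run one inference.
# "model.tflite" and the dummy input below are placeholders for my own model.
import numpy as np
import tensorflow as tf  # interpreter was under tf.contrib.lite in some 1.x releases

interpreter = tf.lite.Interpreter(model_path="model.tflite")
interpreter.allocate_tensors()

input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()

# Dummy input matching the model's expected shape and dtype.
input_data = np.zeros(input_details[0]['shape'], dtype=input_details[0]['dtype'])
interpreter.set_tensor(input_details[0]['index'], input_data)
interpreter.invoke()

print(interpreter.get_tensor(output_details[0]['index']))
```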

raygeeknyc commented May 28, 2018

I've got a project using the TF nightly for the Pi (thanks for that) with the Object Detection API, but inference time is ~1.5 seconds; I expect that I could use TF Lite instead and see significantly better performance. Any ETA for RPi support?
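
For reference, here's roughly the kind of timing loop I'd use to compare the two; it's just a sketch, with `run_inference` and `frame` as stand-ins for my actual detection call and a captured camera frame.

```python
# Rough timing harness (sketch). run_inference() and frame are placeholders
# for the actual detection call and an input frame.
import time

def benchmark(run_inference, frame, warmup=3, runs=10):
    """Return the mean per-frame inference time in seconds."""
    for _ in range(warmup):      # discard warm-up runs
        run_inference(frame)
    start = time.time()
    for _ in range(runs):
        run_inference(frame)
    return (time.time() - start) / runs
```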

tensorflowbutler commented Jun 11, 2018

Nagging Assignee @petewarden: It has been 14 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.

tensorflowbutler commented Jun 26, 2018

Nagging Assignee @petewarden: It has been 29 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.

tensorflowbutler commented Jul 11, 2018

Nagging Assignee @petewarden: It has been 44 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.

tensorflowbutler commented Jul 26, 2018

Nagging Assignee @petewarden: It has been 59 days with no activity and this issue has an assignee. Please update the label and/or status accordingly.

tofulawrence commented Aug 9, 2018

Resolved with the nightly build. For the latency issue, please use a quantized model, which is fast. @achowdhery.
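
For reference, post-training quantization with the TFLite converter looks roughly like the sketch below; `saved_model_dir` and the file names are placeholders, and the exact converter entry point varies across TF versions.

```python
# Post-training quantization sketch with the TFLite converter.
# "saved_model_dir" is a placeholder for an exported SavedModel.
import tensorflow as tf

converter = tf.lite.TFLiteConverter.from_saved_model("saved_model_dir")
converter.optimizations = [tf.lite.Optimize.DEFAULT]  # enable weight quantization
tflite_model = converter.convert()

with open("model_quant.tflite", "wb") as f:
    f.write(tflite_model)
```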

achowdhery self-assigned this Aug 9, 2018

jdlamstein commented Aug 9, 2018
