diff --git a/tensorflow/lite/g3doc/_book.yaml b/tensorflow/lite/g3doc/_book.yaml
index 4c0005ee722d97..a89ce6e007be45 100644
--- a/tensorflow/lite/g3doc/_book.yaml
+++ b/tensorflow/lite/g3doc/_book.yaml
@@ -177,13 +177,10 @@ upper_tabs:
            path: /lite/guide/build_android
          - title: "Build for iOS"
            path: /lite/guide/build_ios
-         - title: "Build for ARM64"
-           path: /lite/guide/build_arm64
-         - title: "Build for Raspberry Pi"
-           path: /lite/guide/build_rpi
+         - title: "Build for ARM"
+           path: /lite/guide/build_arm
          - title: "Build with CMake"
            path: /lite/guide/build_cmake
-           status: experimental
            section:
            - title: "Cross compilation for ARM"
              path: /lite/guide/build_cmake_arm
diff --git a/tensorflow/lite/g3doc/guide/build_arm.md b/tensorflow/lite/g3doc/guide/build_arm.md
new file mode 100644
index 00000000000000..11fd35cc359851
--- /dev/null
+++ b/tensorflow/lite/g3doc/guide/build_arm.md
@@ -0,0 +1,105 @@
+# Build TensorFlow Lite for ARM boards
+
+This page describes how to build the TensorFlow Lite libraries for ARM-based
+computers.
+
+TensorFlow Lite supports two build systems, and the features supported by each
+build system are not identical. Check the following table to pick a proper
+build system.
+
+Feature                                                                                    | Bazel                        | CMake
+------------------------------------------------------------------------------------------ | ---------------------------- | -----
+Predefined toolchains                                                                      | armhf, aarch64               | armel, armhf, aarch64
+Custom toolchains                                                                          | harder to use                | easy to use
+[Select TF ops](https://www.tensorflow.org/lite/guide/ops_select)                          | supported                    | not supported
+[GPU delegate](https://www.tensorflow.org/lite/performance/gpu)                            | only available for Android   | any platform that supports OpenCL
+XNNPACK                                                                                    | supported                    | supported
+[Python Wheel](https://www.tensorflow.org/lite/guide/build_cmake_pip)                      | supported                    | supported
+[C API](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/c/README.md)  | supported                    | [supported](https://www.tensorflow.org/lite/guide/build_cmake#build_tensorflow_lite_c_library)
+[C++ API](https://www.tensorflow.org/lite/guide/inference#load_and_run_a_model_in_c)       | supported for Bazel projects | supported for CMake projects
+
+## Cross-compilation for ARM with CMake
+
+If you have a CMake project or want to use a custom toolchain, CMake is the
+better choice for cross compilation. There is a separate
+[Cross compilation TensorFlow Lite with CMake](https://www.tensorflow.org/lite/guide/build_cmake_arm)
+page for this.
+
+## Cross-compilation for ARM with Bazel
+
+If you have a Bazel project or want to use TF ops, the Bazel build system is
+the better choice. You'll use the integrated
+[ARM GCC 8.3 toolchains](https://github.com/tensorflow/tensorflow/tree/master/third_party/toolchains/embedded/arm-linux)
+with Bazel to build an ARM32/64 shared library.
+
+| Target Architecture | Bazel Configuration     | Compatible Devices         |
+| ------------------- | ----------------------- | -------------------------- |
+| armhf (ARM32)       | --config=elinux_armhf   | RPI3, RPI4 with 32-bit     |
+:                     :                         : Raspberry Pi OS            :
+| AArch64 (ARM64)     | --config=elinux_aarch64 | Coral, RPI4 with 64-bit    |
+:                     :                         : Ubuntu                     :
+
+Note: The generated shared library requires glibc 2.28 or higher to run.
+
+The following instructions have been tested on an Ubuntu 16.04.3 64-bit PC
+(AMD64) and with the TensorFlow devel Docker image
+[tensorflow/tensorflow:devel](https://hub.docker.com/r/tensorflow/tensorflow/tags/).
+
+To cross compile TensorFlow Lite with Bazel, follow these steps:
+
+#### Step 1. Install Bazel
+
+Bazel is the primary build system for TensorFlow. Install the latest version of
+the [Bazel build system](https://bazel.build/versions/master/docs/install.html).
+
+**Note:** If you're using the TensorFlow Docker image, Bazel is already
+available.
+
+#### Step 2. Clone TensorFlow repository
+
+```sh
+git clone https://github.com/tensorflow/tensorflow.git tensorflow_src
+```
+
+**Note:** If you're using the TensorFlow Docker image, the repo is already
+provided in `/tensorflow_src/`.
+
+#### Step 3. Build ARM binary
+
+##### C library
+
+```bash
+bazel build --config=elinux_aarch64 -c opt //tensorflow/lite/c:libtensorflowlite_c.so
+```
+
+You can find a shared library in:
+`bazel-bin/tensorflow/lite/c/libtensorflowlite_c.so`.
+
+**Note:** Use `elinux_armhf` for a
+[32-bit ARM hard float](https://wiki.debian.org/ArmHardFloatPort) build.
+
+Check the
+[TensorFlow Lite C API](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/c/README.md)
+page for details.
+
+##### C++ library
+
+```bash
+bazel build --config=elinux_aarch64 -c opt //tensorflow/lite:libtensorflowlite.so
+```
+
+You can find a shared library in:
+`bazel-bin/tensorflow/lite/libtensorflowlite.so`.
+
+Currently, there is no straightforward way to extract all header files needed,
+so you must include all header files in tensorflow/lite/ from the TensorFlow
+repository. Additionally, you will need header files from FlatBuffers and
+Abseil.
+
+##### Etc
+
+You can also build other Bazel targets with the toolchain. Here are some useful
+targets:
+
+* //tensorflow/lite/tools/benchmark:benchmark_model
+* //tensorflow/lite/examples/label_image:label_image
diff --git a/tensorflow/lite/g3doc/guide/build_arm64.md b/tensorflow/lite/g3doc/guide/build_arm64.md
deleted file mode 100644
index 9d5fbb38c437bf..00000000000000
--- a/tensorflow/lite/g3doc/guide/build_arm64.md
+++ /dev/null
@@ -1,146 +0,0 @@
-# Build TensorFlow Lite for ARM64 boards
-
-This page describes how to build the TensorFlow Lite static and shared libraries
-for ARM64-based computers. If you just want to start using TensorFlow Lite to
-execute your models, the fastest option is to install the TensorFlow Lite
-runtime package as shown in the [Python quickstart](python.md).
-
-Note: This page shows how to compile only the C++ static and shared libraries
-for TensorFlow Lite. Alternative install options include:
-[install just the Python interpreter API](python.md) (for inferencing only);
-[install the full TensorFlow package from pip](https://www.tensorflow.org/install/pip);
-or
-[build the full TensorFlow package](https://www.tensorflow.org/install/source).
-
-**Note:** Cross-compile ARM with CMake is available. Please check
-[this](https://www.tensorflow.org/lite/guide/build_cmake_arm).
-
-## Cross-compile for ARM64 with Make
-
-To ensure the proper build environment, we recommend using one of our TensorFlow
-Docker images such as
-[tensorflow/tensorflow:devel](https://hub.docker.com/r/tensorflow/tensorflow/tags/).
-
-To get started, install the toolchain and libs:
-
-```bash
-sudo apt-get update
-sudo apt-get install crossbuild-essential-arm64
-```
-
-If you are using Docker, you may not use `sudo`.
- -Now git-clone the TensorFlow repository -(https://github.com/tensorflow/tensorflow)—if you're using the TensorFlow Docker -image, the repo is already provided in `/tensorflow_src/`—and then run this -script at the root of the TensorFlow repository to download all the build -dependencies: - -```bash -./tensorflow/lite/tools/make/download_dependencies.sh -``` - -Note that you only need to do this once. - -Then compile: - -```bash -./tensorflow/lite/tools/make/build_aarch64_lib.sh -``` - -This should compile a static library in: -`tensorflow/lite/tools/make/gen/linux_aarch64/lib/libtensorflow-lite.a`. - -## Compile natively on ARM64 - -These steps were tested on HardKernel Odroid C2, gcc version 5.4.0. - -Log in to your board and install the toolchain: - -```bash -sudo apt-get install build-essential -``` - -Now git-clone the TensorFlow repository -(https://github.com/tensorflow/tensorflow) and run this at the root of the -repository: - -```bash -./tensorflow/lite/tools/make/download_dependencies.sh -``` - -Note that you only need to do this once. - -Then compile: - -```bash -./tensorflow/lite/tools/make/build_aarch64_lib.sh -``` - -This should compile a static library in: -`tensorflow/lite/tools/make/gen/linux_aarch64/lib/libtensorflow-lite.a`. - -## Cross-compile for ARM64 with Bazel - -You can use -[ARM GCC toolchains](https://github.com/tensorflow/tensorflow/tree/master/third_party/toolchains/embedded/arm-linux) -with Bazel to build an ARM64 shared library. - -Note: The generated shared library requires glibc 2.28 or higher to run. - -The following instructions have been tested on Ubuntu 16.04.3 64-bit PC (AMD64) -and TensorFlow devel docker image -[tensorflow/tensorflow:devel](https://hub.docker.com/r/tensorflow/tensorflow/tags/). - -To cross compile TensorFlow Lite with Bazel, follow the steps: - -#### Step 1. Install Bazel - -Bazel is the primary build system for TensorFlow. Install the latest version of -the [Bazel build system](https://bazel.build/versions/master/docs/install.html). - -**Note:** If you're using the TensorFlow Docker image, Bazel is already -available. - -#### Step 2. Clone TensorFlow repository - -```sh -git clone https://github.com/tensorflow/tensorflow.git tensorflow_src -``` - -**Note:** If you're using the TensorFlow Docker image, the repo is already -provided in `/tensorflow_src/`. - -#### Step 3. Build ARM64 binary - -##### C library - -```bash -bazel build --config=elinux_aarch64 -c opt //tensorflow/lite/c:libtensorflowlite_c.so -``` - -Check -[TensorFlow Lite C API](https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/c) -page for the detail. - -##### C++ library - -```bash -bazel build --config=elinux_aarch64 -c opt //tensorflow/lite:libtensorflowlite.so -``` - -You can find a shared library in: -`bazel-bin/tensorflow/lite/libtensorflowlite.so`. - -Currently, there is no straightforward way to extract all header files needed, -so you must include all header files in tensorflow/lite/ from the TensorFlow -repository. Additionally, you will need header files from FlatBuffers and -Abseil. - -##### Etc - -You can also build other Bazel targets with the toolchain. Here are some useful -targets. 
- -* //tensorflow/lite/tools/benchmark:benchmark_model -* //tensorflow/lite/examples/label_image:label_image diff --git a/tensorflow/lite/g3doc/guide/build_rpi.md b/tensorflow/lite/g3doc/guide/build_rpi.md deleted file mode 100644 index c837fb37abeb5e..00000000000000 --- a/tensorflow/lite/g3doc/guide/build_rpi.md +++ /dev/null @@ -1,185 +0,0 @@ -# Build TensorFlow Lite for Raspberry Pi - -This page describes how to build the TensorFlow Lite static and shared libraries -for Raspberry Pi. If you just want to start using TensorFlow Lite to execute -your models, the fastest option is to install the TensorFlow Lite runtime -package as shown in the [Python quickstart](python.md). - -**Note:** This page shows how to compile the C++ static and shared libraries for -TensorFlow Lite. Alternative install options include: -[install just the Python interpreter API](python.md) (for inferencing only); -[install the full TensorFlow package from pip](https://www.tensorflow.org/install/pip); -or -[build the full TensorFlow package](https://www.tensorflow.org/install/source_rpi). - -**Note:** This page only covers 32-bit builds. If you're looking for 64-bit -builds, check [Build for ARM64](build_arm64.md) page. - -**Note:** Cross-compile ARM with CMake is available. Please check -[this](https://www.tensorflow.org/lite/guide/build_cmake_arm). - -## Cross-compile for Raspberry Pi with Make - -The following instructions have been tested on Ubuntu 16.04.3 64-bit PC (AMD64) -and TensorFlow devel docker image -[tensorflow/tensorflow:devel](https://hub.docker.com/r/tensorflow/tensorflow/tags/). - -To cross compile TensorFlow Lite follow the steps: - -#### Step 1. Clone official Raspberry Pi cross-compilation toolchain - -```sh -git clone https://github.com/raspberrypi/tools.git rpi_tools -``` - -#### Step 2. Clone TensorFlow repository - -```sh -git clone https://github.com/tensorflow/tensorflow.git tensorflow_src -``` - -**Note:** If you're using the TensorFlow Docker image, the repo is already -provided in `/tensorflow_src/`. - -#### Step 3. Run following script at the root of the TensorFlow repository to download - -all the build dependencies: - -```sh -cd tensorflow_src && ./tensorflow/lite/tools/make/download_dependencies.sh -``` - -**Note:** You only need to do this once. - -#### Step 4a. To build ARMv7 binary for Raspberry Pi 2, 3 and 4 - -```sh -PATH=../rpi_tools/arm-bcm2708/arm-rpi-4.9.3-linux-gnueabihf/bin:$PATH \ - ./tensorflow/lite/tools/make/build_rpi_lib.sh -``` - -**Note:** This should compile a static library in: -`tensorflow/lite/tools/make/gen/rpi_armv7l/lib/libtensorflow-lite.a`. - -You can add additional Make options or target names to the `build_rpi_lib.sh` -script since it's a wrapper of Make with TFLite -[Makefile](https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/tools/make/Makefile). -Here are some possible options: - -```sh -./tensorflow/lite/tools/make/build_rpi_lib.sh clean # clean object files -./tensorflow/lite/tools/make/build_rpi_lib.sh -j 16 # run with 16 jobs to leverage more CPU cores -./tensorflow/lite/tools/make/build_rpi_lib.sh label_image # # build label_image binary -``` - -#### Step 4b. To build ARMv6 binary for Raspberry Pi Zero - -```sh -PATH=../rpi_tools/arm-bcm2708/arm-rpi-4.9.3-linux-gnueabihf/bin:$PATH \ - ./tensorflow/lite/tools/make/build_rpi_lib.sh TARGET_ARCH=armv6 -``` - -**Note:** This should compile a static library in: -`tensorflow/lite/tools/make/gen/rpi_armv6/lib/libtensorflow-lite.a`. 
- -## Compile natively on Raspberry Pi - -The following instructions have been tested on Raspberry Pi Zero, Raspberry Pi -OS GNU/Linux 10 (Buster), gcc version 8.3.0 (Raspbian 8.3.0-6+rpi1): - -To natively compile TensorFlow Lite follow the steps: - -#### Step 1. Log in to your Raspberry Pi and install the toolchain - -```sh -sudo apt-get install build-essential -``` - -#### Step 2. Clone TensorFlow repository - -```sh -git clone https://github.com/tensorflow/tensorflow.git tensorflow_src -``` - -#### Step 3. Run following script at the root of the TensorFlow repository to download all the build dependencies - -```sh -cd tensorflow_src && ./tensorflow/lite/tools/make/download_dependencies.sh -``` - -**Note:** You only need to do this once. - -#### Step 4. You should then be able to compile TensorFlow Lite with: - -```sh -./tensorflow/lite/tools/make/build_rpi_lib.sh -``` - -**Note:** This should compile a static library in: -`tensorflow/lite/tools/make/gen/lib/rpi_armv6/libtensorflow-lite.a`. - -## Cross-compile for armhf with Bazel - -You can use -[ARM GCC toolchains](https://github.com/tensorflow/tensorflow/tree/master/third_party/toolchains/embedded/arm-linux) -with Bazel to build an armhf shared library which is compatible with Raspberry -Pi 2, 3 and 4. - -Note: The generated shared library requires glibc 2.28 or higher to run. - -The following instructions have been tested on Ubuntu 16.04.3 64-bit PC (AMD64) -and TensorFlow devel docker image -[tensorflow/tensorflow:devel](https://hub.docker.com/r/tensorflow/tensorflow/tags/). - -To cross compile TensorFlow Lite with Bazel, follow the steps: - -#### Step 1. Install Bazel - -Bazel is the primary build system for TensorFlow. Install the latest version of -the [Bazel build system](https://bazel.build/versions/master/docs/install.html). - -**Note:** If you're using the TensorFlow Docker image, Bazel is already -available. - -#### Step 2. Clone TensorFlow repository - -```sh -git clone https://github.com/tensorflow/tensorflow.git tensorflow_src -``` - -**Note:** If you're using the TensorFlow Docker image, the repo is already -provided in `/tensorflow_src/`. - -#### Step 3. Build ARMv7 binary for Raspberry Pi 2, 3 and 4 - -##### C library - -```bash -bazel build --config=elinux_armhf -c opt //tensorflow/lite/c:libtensorflowlite_c.so -``` - -Check -[TensorFlow Lite C API](https://github.com/tensorflow/tensorflow/tree/master/tensorflow/lite/c) -page for the detail. - -##### C++ library - -```bash -bazel build --config=elinux_armhf -c opt //tensorflow/lite:libtensorflowlite.so -``` - -You can find a shared library in: -`bazel-bin/tensorflow/lite/libtensorflowlite.so`. - -Currently, there is no straightforward way to extract all header files needed, -so you must include all header files in tensorflow/lite/ from the TensorFlow -repository. Additionally, you will need header files from FlatBuffers and -Abseil. - -##### Etc - -You can also build other Bazel targets with the toolchain. Here are some useful -targets. - -* //tensorflow/lite/tools/benchmark:benchmark_model -* //tensorflow/lite/examples/label_image:label_image diff --git a/tensorflow/lite/g3doc/guide/python.md b/tensorflow/lite/g3doc/guide/python.md index 65754b3c0da8bd..6e21186e4225b0 100644 --- a/tensorflow/lite/g3doc/guide/python.md +++ b/tensorflow/lite/g3doc/guide/python.md @@ -109,3 +109,6 @@ If you're using a Coral ML accelerator, check out the To convert other TensorFlow models to TensorFlow Lite, read about the the [TensorFlow Lite Converter](../convert/). 
+
+If you want to build the `tflite_runtime` wheel, read
+[Build TensorFlow Lite Python Wheel Package](build_cmake_pip.md).
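+
+As a minimal sketch of what comes after installing that wheel, the snippet
+below runs an inference with the `tflite_runtime` interpreter. The model path
+`model.tflite` and the single float32 input are assumptions for illustration;
+substitute your own converted model:
+
+```python
+import numpy as np
+from tflite_runtime.interpreter import Interpreter
+
+# Load the model and allocate its input/output tensors.
+interpreter = Interpreter(model_path="model.tflite")
+interpreter.allocate_tensors()
+
+input_details = interpreter.get_input_details()
+output_details = interpreter.get_output_details()
+
+# Feed dummy data matching the model's input shape (assumes a float32 input).
+input_shape = tuple(input_details[0]["shape"])
+dummy_input = np.random.random_sample(input_shape).astype(np.float32)
+interpreter.set_tensor(input_details[0]["index"], dummy_input)
+
+# Run inference and read back the first output tensor.
+interpreter.invoke()
+print(interpreter.get_tensor(output_details[0]["index"]))
+```
+
+This mirrors the interpreter-only workflow described earlier on this page,
+just using the wheel you built yourself.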