tflite-test

This repo contains scripts and a tool to reproduce the OpenCL delegate issue with the tf.stack/Pack node.

Building and converting the model

  • The model_files folder contains a very simple model with a tf.stack node, along with its corresponding TFLite version (FP32).
    • You can also use generate_dummy_model.py to build the model and convert_model.py to convert it to TFLite (a rough sketch of the idea follows below).
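
The generation and conversion step looks roughly like the following (a minimal sketch with hypothetical shapes and file names; generate_dummy_model.py and convert_model.py contain the actual code):

import tensorflow as tf

class DummyModel(tf.Module):
    # Hypothetical input shapes; the real script may use different ones.
    @tf.function(input_signature=[tf.TensorSpec([1, 8], tf.float32),
                                  tf.TensorSpec([1, 8], tf.float32)])
    def __call__(self, a, b):
        # tf.stack is exported as the TFLite Pack op that the GPU delegate handles
        return tf.stack([a, b], axis=1)

model = DummyModel()
converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [model.__call__.get_concrete_function()], model)
tflite_model = converter.convert()  # FP32 by default, no quantization

with open("dummy_model.tflite", "wb") as f:  # hypothetical output name
    f.write(tflite_model)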

tflite_inference tool

We have implemented a small tool that feeds an input to our sample model using the OpenCL delegate and displays the generated results.

PREREQUISITES:

  • Linux host computer
  • Connectivity to the target device via adb
  • Android NDK, version 22 or later
  • CMake 3.18 or later

BUILD INSTRUCTIONS

  • Unzip the tensorflow_lite_cpp_2_9_1_edited_static.zip file inside the tflite_inference_tool folder.
  • In a terminal, from the tflite_inference_tool folder:
$ mkdir build
$ cd build
$ cmake -G "Unix Makefiles" \
        -DCMAKE_SYSTEM_NAME=Android \
        -DANDROID_ABI=arm64-v8a \
        -DANDROID_STL=c++_shared \
        -DANDROID_NATIVE_API_LEVEL=27 \
        -DCMAKE_VERBOSE_MAKEFILE=ON \
        -DCMAKE_TOOLCHAIN_FILE=<path-to-ndk>/build/cmake/android.toolchain.cmake \
        -DCMAKE_BUILD_TYPE=Release \
        -DTensorFlowLite_ROOT=../tensorflow_lite_cpp_2_9_1_edited_static ..
$ make
  • Here, you must replace <path-to-ndk> with the absolute path of the NDK installed on your computer. If you installed the NDK through Android Studio, it is typically located at /home/<username>/Android/Sdk/ndk/<version>/ on Linux.

  • tensorflow_lite_cpp_2_9_1_edited_static is the TensorFlow Lite library (nightly version) package.

RUN INSTRUCTIONS

WARNING: This step writes to the /data/local/tmp folder on your device. Please make sure existing files in that folder are backed up as needed.

In a terminal, from the tflite_inference_tool folder:

$ ./run_me.sh

The output should be something like this:

INFO: Created TensorFlow Lite delegate for GPU.
INFO: Initialized TensorFlow Lite runtime.
VERBOSE: Replacing 1 node(s) with delegate (TfLiteGpuDelegateV2) node, yielding 1 partitions.
INFO: Initialized OpenCL-based API.
INFO: Created 1 GPU delegate kernels.
0.679688, 0.924316, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 
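
For comparison with the delegate output above, a CPU baseline can be produced on the host with the TFLite Python interpreter. This is only a sanity-check sketch: the model path is an assumption, and run_me.sh may feed a different input than the random one used here.

import numpy as np
import tensorflow as tf

# Path is an assumption; point it at the FP32 .tflite file in model_files.
interpreter = tf.lite.Interpreter(model_path="model_files/model.tflite")
interpreter.allocate_tensors()

# Feed random FP32 data to every input tensor.
for detail in interpreter.get_input_details():
    data = np.random.rand(*detail["shape"]).astype(np.float32)
    interpreter.set_tensor(detail["index"], data)

interpreter.invoke()

# Print the (CPU-computed) output for comparison with the delegate result.
output_detail = interpreter.get_output_details()[0]
print(interpreter.get_tensor(output_detail["index"]).flatten())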
