
how to build tensorflow lite into a static c++ library using android ndk #14688

Closed
huanyingjun opened this issue Nov 18, 2017 · 14 comments
Labels
comp:lite TF Lite related issues stat:awaiting tensorflower Status - Awaiting response from tensorflower

Comments

@huanyingjun

I want to write some C++ test binaries using TensorFlow Lite.
From the README.md I can only see how to build the demo app.
Could you please tell me how to build TensorFlow Lite into a static library using the Android NDK?

@miaout17 miaout17 added the comp:lite TF Lite related issues label Nov 20, 2017
@shaurya0
Contributor

I recently managed to do this myself for armeabi-v7a. Here is what I had to do:

  • Download the Android NDK
  • Set the environment variable, e.g.: export NDK_ROOT=~/AndroidNDK/android-ndk-r14b/
  • Compile the static library libcpufeatures.a in android-ndk-r14b/sources/android/cpufeatures
  • From the tensorflow directory, run make -f tensorflow/contrib/lite/Makefile TARGET=ANDROID ANDROID_ARCH=armeabi-v7a

I made some other changes and added a file:
https://gist.github.com/shaurya0/eac838b390df3461b972e966b015a3a2

@shivaniag
Contributor

@andrehentz could you please take a look.

@shivaniag shivaniag added the stat:awaiting tensorflower Status - Awaiting response from tensorflower label Nov 21, 2017
@huanyingjun
Author

@shaurya0
I tried your method and I can successfully build libtensorflow-lite.a.
But when I link libtensorflow-lite.a with my test binary, I get a lot of errors:

[armeabi-v7a] Executable : test_tflite
jni/libtensorflow-lite.a(register.o):register.cc:function tflite::ops::builtin::BuiltinOpResolver::FindOp(char const*) const: error: undefined reference to 'std::_Hash_bytes(void const*, unsigned int, unsigned int)'
jni/libtensorflow-lite.a(register.o):register.cc:function tflite::ops::builtin::BuiltinOpResolver::BuiltinOpResolver(): error: undefined reference to 'std::__detail::_Prime_rehash_policy::_M_next_bkt(unsigned int) const'
jni/libtensorflow-lite.a(register.o):register.cc:function tflite::ops::builtin::BuiltinOpResolver::BuiltinOpResolver(): error: undefined reference to 'std::__detail::_Prime_rehash_policy::_M_next_bkt(unsigned int) const'
jni/libtensorflow-lite.a(register.o):register.cc:function tflite::ops::builtin::BuiltinOpResolver::AddCustom(char const*, TfLiteRegistration*): error: undefined reference to 'std::_Hash_bytes(void const*, unsigned int, unsigned int)'
jni/libtensorflow-lite.a(register.o):register.cc:function std::_Hashtable<tflite::BuiltinOperator, std::pair<tflite::BuiltinOperator const, TfLiteRegistration*>, std::allocator<std::pair<tflite::BuiltinOperator const, TfLiteRegistration*> >, std::__detail::_Select1st, std::equal_to<tflite::BuiltinOperator>, tflite::ops::builtin::BuiltinOpResolver::BuiltinOperatorHasher, std::__detail::_Mod_range_hashing, std::__detail::_Default_ranged_hash, std::__detail::_Prime_rehash_policy, std::__detail::_Hashtable_traits<true, false, true> >::_M_insert_unique_node(unsigned int, unsigned int, std::__detail::_Hash_node<std::pair<tflite::BuiltinOperator const, TfLiteRegistration*>, true>*): error: undefined reference to 'std::__detail::_Prime_rehash_policy::_M_need_rehash(unsigned int, unsigned int, unsigned int) const'
jni/libtensorflow-lite.a(register.o):register.cc:function std::_Hashtable<std::string, std::pair<std::string const, TfLiteRegistration*>, std::allocator<std::pair<std::string const, TfLiteRegistration*> >, std::__detail::_Select1st, std::equal_to<std::string>, std::hash<std::string>, std::__detail::_Mod_range_hashing, std::__detail::_Default_ranged_hash, std::__detail::_Prime_rehash_policy, std::__detail::_Hashtable_traits<true, false, true> >::_M_insert_unique_node(unsigned int, unsigned int, std::__detail::_Hash_node<std::pair<std::string const, TfLiteRegistration*>, true>*): error: undefined reference to 'std::__detail::_Prime_rehash_policy::_M_need_rehash(unsigned int, unsigned int, unsigned int) const'
jni/libtensorflow-lite.a(svdf.o):svdf.cc:function _GLOBAL__sub_I_svdf.cc: error: undefined reference to 'std::ios_base::Init::Init()'
jni/libtensorflow-lite.a(svdf.o):svdf.cc:function _GLOBAL__sub_I_svdf.cc: error: undefined reference to 'std::ios_base::Init::~Init()'
jni/libtensorflow-lite.a(nnapi_delegate.o):nnapi_delegate.cc:function tflite::addTensorOperands(tflite::Interpreter*, ANeuralNetworksModel*): error: undefined reference to '__dynamic_cast'
jni/libtensorflow-lite.a(simple_memory_arena.o):simple_memory_arena.cc:function tflite::SimpleMemoryArena::Allocate(TfLiteContext*, unsigned int, unsigned int, tflite::ArenaAlloc*): error: undefined reference to 'std::__detail::_List_node_base::_M_hook(std::__detail::_List_node_base*)'
jni/libtensorflow-lite.a(simple_memory_arena.o):simple_memory_arena.cc:function tflite::SimpleMemoryArena::Deallocate(TfLiteContext*, tflite::ArenaAlloc const&): error: undefined reference to 'std::__detail::_List_node_base::_M_unhook()'
jni/libtensorflow-lite.a(embedding_lookup.o):embedding_lookup.cc:function _GLOBAL__sub_I_embedding_lookup.cc: error: undefined reference to 'std::ios_base::Init::Init()'
jni/libtensorflow-lite.a(embedding_lookup.o):embedding_lookup.cc:function _GLOBAL__sub_I_embedding_lookup.cc: error: undefined reference to 'std::ios_base::Init::~Init()'
jni/libtensorflow-lite.a(hashtable_lookup.o):hashtable_lookup.cc:function _GLOBAL__sub_I_hashtable_lookup.cc: error: undefined reference to 'std::ios_base::Init::Init()'
jni/libtensorflow-lite.a(hashtable_lookup.o):hashtable_lookup.cc:function _GLOBAL__sub_I_hashtable_lookup.cc: error: undefined reference to 'std::ios_base::Init::~Init()'
jni/libtensorflow-lite.a(lstm.o):lstm.cc:function _GLOBAL__sub_I_lstm.cc: error: undefined reference to 'std::ios_base::Init::Init()'
jni/libtensorflow-lite.a(lstm.o):lstm.cc:function _GLOBAL__sub_I_lstm.cc: error: undefined reference to 'std::ios_base::Init::~Init()'
jni/libtensorflow-lite.a(conv.o):conv.cc:function EigenForTFLite::Barrier::Notify() [clone .part.60]: error: undefined reference to 'std::condition_variable::notify_all()'
jni/libtensorflow-lite.a(conv.o):conv.cc:function EigenForTFLite::Barrier::Notify() [clone .part.60]: error: undefined reference to 'std::condition_variable::notify_all()'
jni/libtensorflow-lite.a(conv.o):conv.cc:function tflite::multithreaded_ops::GetThreadPoolDevice(): error: undefined reference to 'std::condition_variable::condition_variable()'
jni/libtensorflow-lite.a(conv.o):conv.cc:function tflite::multithreaded_ops::GetThreadPoolDevice(): error: undefined reference to 'std::condition_variable::condition_variable()'
jni/libtensorflow-lite.a(conv.o):conv.cc:function tflite::multithreaded_ops::GetThreadPoolDevice(): error: undefined reference to 'std::condition_variable::~condition_variable()'
jni/libtensorflow-lite.a(conv.o):conv.cc:function tflite::multithreaded_ops::GetThreadPoolDevice(): error: undefined reference to 'std::thread::_M_start_thread(std::shared_ptr<std::thread::_Impl_base>)'
jni/libtensorflow-lite.a(conv.o):conv.cc:function tflite::multithreaded_ops::GetThreadPoolDevice(): error: undefined reference to 'std::condition_variable::~condition_variable()'
jni/libtensorflow-lite.a(conv.o):conv.cc:function std::_Function_handler<void (int, int), EigenForTFLite::ThreadPoolDevice::parallelFor(int, EigenForTFLite::TensorOpCost const&, std::function<int (int)>, std::function<void (int, int)>) const::{lambda(int, int)#1}>::_M_invoke(std::_Any_data const&, int, int): error: undefined reference to 'std::condition_variable::notify_all()'
jni/libtensorflow-lite.a(conv.o):conv.cc:function std::_Function_handler<void (int, int), EigenForTFLite::ThreadPoolDevice::parallelFor(int, EigenForTFLite::TensorOpCost const&, std::function<int (int)>, std::function<void (int, int)>) const::{lambda(int, int)#1}>::_M_invoke(std::_Any_data const&, int, int): error: undefined reference to 'std::condition_variable::notify_all()'
jni/libtensorflow-lite.a(conv.o):conv.cc:function EigenForTFLite::EventCount::Unpark(EigenForTFLite::EventCount::Waiter*) [clone .isra.58]: error: undefined reference to 'std::condition_variable::notify_one()'
jni/libtensorflow-lite.a(conv.o):conv.cc:function EigenForTFLite::NonBlockingThreadPoolTempl<EigenForTFLite::StlThreadEnvironment>::~NonBlockingThreadPoolTempl(): error: undefined reference to 'std::thread::join()'
jni/libtensorflow-lite.a(conv.o):conv.cc:function EigenForTFLite::NonBlockingThreadPoolTempl<EigenForTFLite::StlThreadEnvironment>::~NonBlockingThreadPoolTempl(): error: undefined reference to 'std::condition_variable::~condition_variable()'
jni/libtensorflow-lite.a(conv.o):conv.cc:function EigenForTFLite::Barrier::Wait(): error: undefined reference to 'std::condition_variable::wait(std::unique_lock<std::mutex>&)'
jni/libtensorflow-lite.a(conv.o):conv.cc:function EigenForTFLite::ThreadPoolDevice::parallelFor(int, EigenForTFLite::TensorOpCost const&, std::function<int (int)>, std::function<void (int, int)>) const: error: undefined reference to 'std::condition_variable::condition_variable()'
jni/libtensorflow-lite.a(conv.o):conv.cc:function EigenForTFLite::ThreadPoolDevice::parallelFor(int, EigenForTFLite::TensorOpCost const&, std::function<int (int)>, std::function<void (int, int)>) const: error: undefined reference to 'std::condition_variable::~condition_variable()'
jni/libtensorflow-lite.a(conv.o):conv.cc:function EigenForTFLite::NonBlockingThreadPoolTempl<EigenForTFLite::StlThreadEnvironment>::WaitForWork(EigenForTFLite::EventCount::Waiter*, EigenForTFLite::StlThreadEnvironment::Task*): error: undefined reference to 'std::condition_variable::wait(std::unique_lock<std::mutex>&)'
jni/libtensorflow-lite.a(conv.o):conv.cc:function EigenForTFLite::NonBlockingThreadPoolTemplEigenForTFLite::StlThreadEnvironment::WorkerLoop(int): error: undefined reference to 'std::_Hash_bytes(void const*, unsigned int, unsigned int)'
jni/libtensorflow-lite.a(conv.o):conv.cc:function void EigenForTFLite::TensorEvaluator<EigenForTFLite::TensorContractionOp<std::array<EigenForTFLite::IndexPair<int>, 1u> const, EigenForTFLite::TensorMap<EigenForTFLite::Tensor<float const, 2, 1, int>, 16, EigenForTFLite::MakePointer> const, EigenForTFLite::TensorMap<EigenForTFLite::Tensor<float const, 2, 1, int>, 16, EigenForTFLite::MakePointer> const> const, EigenForTFLite::ThreadPoolDevice>::evalProduct<true, true, false, 0>(float*) const: error: undefined reference to 'std::condition_variable::condition_variable()'
jni/libtensorflow-lite.a(conv.o):conv.cc:typeinfo for int (int): error: undefined reference to 'vtable for __cxxabiv1::__function_type_info'

@shaurya0
Contributor

I've been able to build a binary that links against libtensorflow-lite.a, both inside the TensorFlow Lite source tree and outside of it. Judging by the linker errors, the problem looks like it is in how you are linking libstdc++.
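For ndk-build users, missing std:: symbols like these usually mean the NDK's default minimal "system" STL is being linked. A minimal sketch of an Application.mk that selects a full STL instead — the file location and exact flag set are assumptions about the asker's project layout, not something confirmed in this thread:

```make
# jni/Application.mk (hypothetical location, next to the test binary's Android.mk)
APP_STL := gnustl_static       # full GNU STL; c++_static (libc++) also works on newer NDKs
APP_CPPFLAGS += -std=c++11 -frtti -fexceptions   # -frtti covers the '__dynamic_cast' reference
APP_ABI := armeabi-v7a
```

After changing APP_STL, a clean rebuild (ndk-build clean && ndk-build) is needed so all objects are recompiled against the same STL.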

@jakiechris

jakiechris commented Nov 22, 2017

Just want to ask: can libtensorflow-lite.a do predictions only on Android phones?
And is this lib much smaller?

@vnsmurthysristi

Can you please share the steps to build a TFLite static binary for Android C++?

@JaviBonilla

I tried to follow the mentioned steps but I am getting an error:

fatal error: tensorflow/core/lib/core/error_codes.pb.h: No such file or directory

I think this is a file generated by Bazel; is it required? These are the steps I followed. I am new to TFLite, so I am probably missing something.

  1. Clone repository
  2. Set NDK_ROOT
  3. Download dependencies: tensorflow/contrib/lite/download_dependencies.sh
  4. Make from TensorFlow directory: make -f tensorflow/contrib/lite/Makefile TARGET=ANDROID ANDROID_ARCH=armeabi-v7a

@andrehentz
Contributor

Bazel should not be required and you are not missing anything :) Unfortunately we introduced a dependency on benchmark_model which isn't easily resolved with Make. The fix is #19019

@kargarisaac

I want to write C++ code to run TensorFlow models on Android and iOS. Both of them support a C++ API, so I want a single file, .tflite or .pb. Can anybody help?

@jefhai

jefhai commented Nov 21, 2018

@kargarisaac did you figure it out? What steps did you take?

@andrehentz
Contributor

@kargarisaac Just to clarify: do you want a single .tflite file that can be interpreted both on android and ios? That should be possible. What's not working for you? Could you open a separate issue to track that?

@jefhai

jefhai commented Nov 26, 2018

@andrehentz Do you have any C++ examples of TensorFlow Lite with the Android NDK?
I'm looking for more information that can help me develop an Android app using the NDK.
I'm also really looking for how you access the .tflite file path so you can build the flatbuffer model in C++.
I know this is an off-topic ask; there's no direct message feature though...

@andrehentz
Contributor

The API doc describes how to write your C++ program to access a .tflite file and run inference. On Android you will most likely need to interface with Java, which you can do via our own JNI code. If you prefer, TF Lite provides Java APIs too.
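To make that concrete, here is a minimal sketch of the C++ flow the API doc describes, using the contrib-era include paths from this thread (later releases moved them to tensorflow/lite/). The model path, the float tensor type, and the error handling are illustrative assumptions, not taken from any poster's code:

```cpp
#include <memory>

#include "tensorflow/contrib/lite/interpreter.h"
#include "tensorflow/contrib/lite/kernels/register.h"
#include "tensorflow/contrib/lite/model.h"

int main() {
  // Load the flatbuffer model from disk ("model.tflite" is a placeholder path).
  auto model = tflite::FlatBufferModel::BuildFromFile("model.tflite");
  if (!model) return 1;

  // Resolve the builtin ops and build an interpreter over the model.
  tflite::ops::builtin::BuiltinOpResolver resolver;
  std::unique_ptr<tflite::Interpreter> interpreter;
  tflite::InterpreterBuilder(*model, resolver)(&interpreter);
  if (!interpreter || interpreter->AllocateTensors() != kTfLiteOk) return 1;

  // Fill the first input tensor (assumes a model with float inputs).
  float* input = interpreter->typed_input_tensor<float>(0);
  input[0] = 1.0f;  // ...populate with real data

  // Run inference and read the first output tensor.
  if (interpreter->Invoke() != kTfLiteOk) return 1;
  float* output = interpreter->typed_output_tensor<float>(0);
  (void)output;  // consume results here
  return 0;
}
```

Link this against the libtensorflow-lite.a produced by the Makefile steps earlier in the thread; on Android the binary or JNI library is then packaged as usual with the NDK.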

@kargarisaac

@jefhai @andrehentz
Sorry for the late response. No, I couldn't do that. I used the frozen .pb file for Android and converted it for use with Core ML on iOS.

10 participants