Build an Android TensorFlow binary with DT_BOOL support #3
Did you build your own native library? |
No, not yet... I tried to use the GPU version provided by miguel, but something else is going wrong. Can anyone provide a built native library with bool support? I suppose I am not the only one having trouble with this, and it would spare me a lot of time getting started with building the C lib myself. Thanks for any help :) |
I built a bool-supported native lib: Google Drive. And you can find GPU-supported binaries in
If you want to get the dll yourself, copy |
Thanks man! Your newly built native library is working for me :) |
Great! |
Hi @gmlwns2000, I have tried your built libraries and they are working for me. I'm pretty interested in how to build libraries like that as well. Could you share how you made them? Thanks |
Hi @hgffly,
My solution is not official and not clean either; if you guys have a better solution for building the native library for Android, please share how to make it :)
Hi gmlwns2000,
Thanks for sharing how to build the library for Android.
But I encountered an error in step 3:
ERROR:
/home/wilson/.cache/bazel/_bazel_wilson/416ee1e5c1dd220e496a2567f04ee5b5/external/protobuf/BUILD:113:1:
C++ compilation of rule '@protobuf//:protobuf' failed: false failed: error
executing command /bin/false -MD -MF
bazel-out/stub_armeabi-v7a-py3-opt/bin/external/protobuf/_objs/protobuf/external/protobuf/src/google/protobuf/wrappers.pb.pic.d
... (remaining 26 argument(s) skipped):
com.google.devtools.build.lib.shell.BadExitStatusException: Process exited
with status 1.
Target //tensorflow/contrib/android:libtensorflow_inference.so failed to build
I need to investigate how to resolve it; once I have, I will try again, thanks.
Which version of TensorFlow do you use? Mine is r1.2.
Thanks for the help!
2017-08-02 12:25 GMT+08:00 AinL <notifications@github.com>:
… Hi @hgffly <https://github.com/hgffly>,
This is how to build the TensorFlow C lib for Android. I hope this helps you.
1. Make TensorFlow for Android support DT_BOOL.
Edit the following line
<https://github.com/tensorflow/tensorflow/blob/v1.1.0-rc2/tensorflow/core/framework/register_types.h#L125>
from #define TF_CALL_bool(m) to #define TF_CALL_bool(m) m(bool)
Source: stackoverflow
<https://stackoverflow.com/questions/40855271/no-opkernel-was-registered-to-support-op-switch-with-these-attrs-on-ios/43627334#43627334>
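The step-1 edit can also be scripted. A minimal sketch of the sed substitution, run here against a stand-in temp file rather than the real tensorflow/core/framework/register_types.h (apply the same sed line to that file in an actual checkout):

```shell
# Demonstrate the step-1 macro change with sed on a stand-in file.
# In a real checkout, run the sed line against
# tensorflow/core/framework/register_types.h instead of "$f".
set -eu
f=$(mktemp)
printf '#define TF_CALL_bool(m)\n' > "$f"
# Turn the no-op bool macro into one that instantiates kernels for bool.
sed -i.bak 's/^#define TF_CALL_bool(m)$/#define TF_CALL_bool(m) m(bool)/' "$f"
grep 'TF_CALL_bool' "$f"
rm -f "$f" "$f.bak"
```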
2. Change the build options so that the TF_* functions appear in
libtensorflow_inference.so.
This file
<https://github.com/tensorflow/tensorflow/blob/v1.1.0-rc2/tensorflow/contrib/android/jni/version_script.lds>
defines which functions should be exported. Add a line TF_*; in the global
definition:
VERS_1.0 {
  global:
    Java_*;
    JNI_OnLoad;
    JNI_OnUnload;
    TF_*;
  local:
    *;
};
3. Build libtensorflow_inference.so. You will need to build several times,
once per target platform (arm64-v8a, armeabi-v7a, x86, x86_64).
Enter this command at the root of the tensorflow dir:
bazel build -c opt //tensorflow/contrib/android:libtensorflow_inference.so \
  --crosstool_top=//external:android/crosstool \
  --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
  --cpu=armeabi-v7a
- To change the target platform, just change --cpu=armeabi-v7a.
- If you run out of memory while building, add --verbose_failures
--local_resources 4096,4.0,1.0 -j 1
4. Now you can find libtensorflow_inference.so in
bazel-bin/tensorflow/contrib/android. Copy libtensorflow_inference.so
somewhere else, go back to step 3, change the cpu target, and build again
for the remaining platforms.
5. After building the binary for every platform, rename each
libtensorflow_inference.so to libtensorflow.so, and then copy the
files into the correct place
(ex: TensorflowSharp/GitIgnoredData/android/armeabi-v7a/)
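Steps 3-5 above can be wrapped in one loop over the four ABIs. A dry-run sketch (commands are echoed rather than executed, since bazel and the Android NDK setup are environment-specific; OUT is a hypothetical destination, e.g. TensorflowSharp/GitIgnoredData/android):

```shell
# Dry-run sketch of steps 3-5: print the build/rename commands per ABI.
# OUT is a hypothetical destination directory, not from the thread.
set -eu
OUT=android-libs
for cpu in armeabi-v7a arm64-v8a x86 x86_64; do
  # step 3: one bazel invocation per target platform
  echo "bazel build -c opt //tensorflow/contrib/android:libtensorflow_inference.so" \
       "--crosstool_top=//external:android/crosstool --cpu=$cpu"
  # step 5: each ABI's binary ends up as libtensorflow.so in its own folder
  echo "mkdir -p $OUT/$cpu"
  echo "cp bazel-bin/tensorflow/contrib/android/libtensorflow_inference.so $OUT/$cpu/libtensorflow.so"
done
```

Remove the echo prefixes to actually run the builds once the Android toolchain is configured in your WORKSPACE.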
My solution is not official and not clean either; if you guys have a
better solution for building the native library for Android, please share
how to make it :)
Thanks
|
Hi @hgffly, I used TensorFlow 1.2.0-r0 to build the library. |
Hi gmlwns2000,
It's fine, you have already helped a lot, thanks so much!
2017-08-03 19:18 GMT+08:00 AinL <notifications@github.com>:
… Hi @hgffly <https://github.com/hgffly>,
I used TensorFlow 1.2.0-r0 to build the library.
I'm sorry, but I am not good at bazel building, so I can't help you :(
|
Hey guys,
I get the following error when running a network on Android:
Unhandled Exception: TensorFlow.TFException: No OpKernel was registered to support Op 'Switch' with these attrs. Registered devices: [CPU], Registered kernels:
  device='GPU'; T in [DT_STRING]
  device='GPU'; T in [DT_BOOL]
  device='GPU'; T in [DT_INT32]
  device='GPU'; T in [DT_FLOAT]
  device='CPU'; T in [DT_FLOAT]
  device='CPU'; T in [DT_INT32]
[[Node: model/comb1/comb1/cond/Switch = Switch[T=DT_BOOL](ph/training_ph, ph/training_ph)]]
It looks like the type bool is not supported in the CPU version of TensorFlow.
I found this fix: tensorflow/models#1740
but wasn't able to make it work...