
UnsatisfiedLinkError: No implementation found for void org.tensorflow.lite.NativeInterpreterWrapper.allowBufferHandleOutput(long, boolean) #28335

Closed
alexcohn opened this issue May 2, 2019 · 0 comments
Labels: comp:lite (TF Lite related issues), type:support (Support issues)


alexcohn commented May 2, 2019

System information

  • OS Platform and Distribution: Linux 4.4.0-17134-Microsoft #706-Microsoft x86_64 GNU/Linux (Ubuntu 18.04)
  • Mobile device: Android, arm64
  • TensorFlow installed from (source or binary): source and binary
  • TensorFlow version: 1.13.1
  • Bazel version: 0.24.1
  • GCC/Compiler version: NDK r18, clang

Describe the problem
Using the 'official' implementation 'org.tensorflow:tensorflow-lite:0.0.0-nightly', I tried

opt.setAllowBufferHandleOutput(true)

and got:

E/aoyi.run.tflit: No implementation found for void org.tensorflow.lite.NativeInterpreterWrapper.allowBufferHandleOutput(long, boolean) (tried Java_org_tensorflow_lite_NativeInterpreterWrapper_allowBufferHandleOutput and Java_org_tensorflow_lite_NativeInterpreterWrapper_allowBufferHandleOutput__JZ)

The same error is reproduced when I build libtensorflowlite_jni.so locally with bazel.

Provide the exact sequence of commands / steps that you executed before running into the problem

To verify that this was a build issue, I ran

> nm -D jniLibs/arm64-v8a/libtensorflowlite_jni.so | grep allow
0000000000010940 T Java_org_tensorflow_lite_NativeInterpreterWrapper_allowFp16PrecisionForFp32

Analysis

The root cause is that the build relies on nativeinterpreterwrapper_jni.h to declare the JNI functions as extern "C". Java_org_tensorflow_lite_NativeInterpreterWrapper_allowBufferHandleOutput was introduced in an 18 Dec 2018 commit that did not update the .h file, so the definition in the .cc file was compiled with C++ name mangling and never exported under the name the JVM looks up.

Proposed fix

The easy fix (tested here) is to tag the exported JNI functions as extern "C" directly in the .cc file. Relying on the header file for the linkage specification is unnecessary and, as this issue demonstrates, error-prone.

alexcohn added a commit to alexcohn/tensorflow that referenced this issue May 2, 2019
fixing [UnsatisfiedLinkError: No implementation found for void org.tensorflow.lite.NativeInterpreterWrapper.allowBufferHandleOutput(long, boolean)](tensorflow#28335).
@achandraa achandraa self-assigned this May 3, 2019
@achandraa achandraa added comp:lite TF Lite related issues type:support Support issues labels May 3, 2019
@alanchiao alanchiao assigned jdduke and alanchiao and unassigned jdduke and alanchiao May 5, 2019
jdduke added a commit to jdduke/tensorflow that referenced this issue May 7, 2019