
<uses-native-library> missing in Android LLM Inference Guide #5326

Open
bringert opened this issue Apr 18, 2024 · 3 comments
Assignees: kuaashish
Labels: platform:android · stat:awaiting googler · task:LLM inference · type:docs-bug

Comments

bringert commented Apr 18, 2024

Description of issue (what needs changing)

Add the <uses-native-library> manifest tags to the Android LLM Inference Guide.

Clear description

The Android LLM Inference Guide (https://developers.google.com/mediapipe/solutions/genai/llm_inference/android) doesn't mention that <uses-native-library> entries need to be added to AndroidManifest.xml.

The tags are present in the example app's manifest (https://github.com/googlesamples/mediapipe/blob/main/examples/llm_inference/android/app/src/main/AndroidManifest.xml):

        <!-- Required to initialize the LlmInference -->
        <uses-native-library
            android:name="libOpenCL.so"
            android:required="false"/>
        <uses-native-library android:name="libOpenCL-car.so" android:required="false"/>
        <uses-native-library android:name="libOpenCL-pixel.so" android:required="false"/>

Without those tags, I get a SIGABRT with these messages:

E0000 00:00:1713429439.990396    7017 calculator_graph.cc:887] FAILED_PRECONDITION: CalculatorGraph::Run() failed: Calculator::Open() for node "LlmGpuCalculator" failed: Can not open OpenCL library on this device - undefined symbol: clSetPerfHintQCOM
F0000 00:00:1713429439.990602    7017 llm_engine.cc:126] Check failed: graph_->WaitUntilIdle() is OK (FAILED_PRECONDITION: CalculatorGraph::Run() failed: Calculator::Open() for node "LlmGpuCalculator" failed: Can not open OpenCL library on this device - undefined symbol: clSetPerfHintQCOM)
terminating.

Correct links

No response

Parameters defined

No response

Returns defined

No response

Raises listed and defined

No response

Usage example

No response

Request visuals, if applicable

No response

Submit a pull request?

No response

kuaashish assigned kuaashish and unassigned ayushgdev on Apr 19, 2024
kuaashish added the task:LLM inference, platform:android, and type:docs-bug labels on Apr 19, 2024
kuaashish (Collaborator) commented:

Hi @bringert,

Thank you for bringing this to our attention. We are forwarding the issue internally and will work to include it in our documentation soon.

Thank you!!

kuaashish (Collaborator) commented:

Hi @schmidt-sebastian,

Could you please look into this issue?

Thank you!!

kuaashish added the stat:awaiting googler label on Apr 19, 2024
schmidt-sebastian (Collaborator) commented:

cc @ktonthat
