Converted Gemma 2b tflite generates the same token #5258
Labels
platform:android
Issues with Android as Platform
platform:python
MediaPipe Python issues
stat:awaiting response
Waiting for user response
task:LLM inference
Issues related to MediaPipe LLM Inference Gen AI setup
type:support
General questions
Have I written custom code (as opposed to using a stock example script provided in MediaPipe)?
No
OS Platform and Distribution
Android 11
MediaPipe Tasks SDK version
0.10.11
Task name (e.g. Image classification, Gesture recognition etc.)
LLM Inference
Programming Language and version (e.g. C++, Python, Java)
Python
Describe the actual behavior
I converted the model to model.tflite and pushed it to /data/local/tmp/llm, but LLM Inference keeps outputting the same token repeatedly.
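To make the symptom concrete, a minimal sketch of how one might detect this failure mode when driving generation from Python: the helper below (`is_degenerate` is a hypothetical name, not a MediaPipe API) flags output whose tail has collapsed to a single repeated token, which is what the converted model appears to produce.

```python
def is_degenerate(tokens, min_repeats=5):
    """Return True if the last `min_repeats` generated tokens are all identical,
    i.e. generation has collapsed to repeating one token."""
    if len(tokens) < min_repeats:
        return False
    tail = tokens[-min_repeats:]
    return len(set(tail)) == 1


# Example: healthy output varies, broken output repeats one token id.
print(is_degenerate([12, 845, 7, 91, 3, 3021]))      # varied tail -> False
print(is_degenerate([12, 845, 7, 7, 7, 7, 7, 7]))    # repeated tail -> True
```

Running such a check on the inference output would confirm whether the problem is degenerate sampling rather than, say, a tokenizer mismatch.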
Describe the expected behaviour
After conversion, LLM Inference should output a dialog response rather than a single repeated token.
Standalone code/steps you may have used to try to get what you need
Other info / Complete Logs