
[LLM][falcon_cpu]Failed to run on Android Device (internal:Failed to initialize engine: %s Failed to create engine: INTERNAL:;RET_CHECK failure(mediapipe/tasks/cc/genai/inference/utils/xnn_utils/tflite_weight_accessor.cc:102)tflite_model_ #5600

Open
charles-cloud opened this issue Sep 2, 2024 · 2 comments
Labels
- platform:android — Issues with Android as Platform
- stat:awaiting googler — Waiting for Google Engineer's Response
- task:LLM inference — Issues related to MediaPipe LLM Inference Gen AI setup

Comments

@charles-cloud

Have I written custom code (as opposed to using a stock example script provided in MediaPipe)?

None

OS Platform and Distribution

Android 12

MediaPipe Tasks SDK version

MEDIAPIPE_FULL_VERSION = "0.10.15"

Task name (e.g. Image classification, Gesture recognition etc.)

LLM

Programming Language and version (e.g. C++, Python, Java)

java

Describe the actual behavior

Failed to execute it on device

Describe the expected behaviour

It should run on device.

Standalone code/steps you may have used to try to get what you need

1. Convert the model to tflite and change the model path in Inference.kt
2. Create APK
3. Run it on device
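For context, the model-path change in step 1 corresponds to the options passed when creating the LLM Inference task. A minimal Kotlin sketch, assuming the MediaPipe `tasks-genai` dependency and a hypothetical on-device path for the converted model:

```kotlin
import android.content.Context
import com.google.mediapipe.tasks.genai.llminference.LlmInference

// Hypothetical on-device location for the converted falcon_cpu.bin model.
private const val MODEL_PATH = "/data/local/tmp/llm/falcon_cpu.bin"

fun createLlmInference(context: Context): LlmInference {
    val options = LlmInference.LlmInferenceOptions.builder()
        .setModelPath(MODEL_PATH) // step 1: point the task at the converted model
        .setMaxTokens(512)
        .build()
    // This is the call that surfaces the RET_CHECK failure from the issue
    // title when the engine cannot load the model weights.
    return LlmInference.createFromOptions(context, options)
}
```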

Other info / Complete Logs

No response

@kuaashish kuaashish assigned kuaashish and unassigned ayushgdev Sep 2, 2024
@kuaashish kuaashish added platform:android Issues with Android as Platform task:LLM inference Issues related to MediaPipe LLM Inference Gen AI setup type:support General questions labels Sep 2, 2024
@kuaashish
Collaborator

Hi @charles-cloud,

We believe this issue might be due to our update to SDK version 34 throughout MediaPipe; you can find the same information in our v0.10.15 release notes: https://github.com/google-ai-edge/mediapipe/releases/tag/v0.10.15. Since Android 12 uses SDK version 31, could you please test on Android 14, which uses SDK version 34, and let us know if you still experience the crash?
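The SDK-version check above can be verified directly in app code. A sketch, assuming the standard Android `Build` API (the API-level mapping comes from Android's release numbering, not from MediaPipe itself):

```kotlin
import android.os.Build

// MediaPipe v0.10.15 targets SDK 34 (Android 14); Android 12 reports API 31.
fun meetsMediaPipeSdkRequirement(): Boolean =
    Build.VERSION.SDK_INT >= Build.VERSION_CODES.UPSIDE_DOWN_CAKE // API 34
```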

Thank you!

@kuaashish kuaashish added the stat:awaiting response Waiting for user response label Sep 3, 2024
@charles-cloud
Author

Hi @kuaashish ,

Thank you for the information.
I still see the same issue on Android 14 with the falcon_cpu.bin TFLite model.

Could you please point me to where debug logs can be enabled in the native source code? That would help me build libllm_inference_engine_jni.so and determine whether it is a platform issue.

4 participants