Build libllm_inference_engine_jni.so with GPU support?
Hozzu opened this issue
Philkyue Shin commented
OS Platform and Distribution
Linux Ubuntu 22.04
Compiler version
No response
Programming Language and version
Bazel
Installed using virtualenv? pip? Conda? (if python)
No response
MediaPipe version
0.10.14
Bazel version
6.1.1
XCode and Tulsi versions (if iOS)
No response
Android SDK and NDK versions (if android)
No response
Android AAR (if android)
None
OpenCV version (if running on desktop)
No response
Describe the problem
libllm_inference_engine_jni.so seems to work only on the CPU
Complete Logs
I am trying to build libllm_inference_engine_jni.so with GPU support.
However, the default build command appears to produce a library that works only on the CPU.
(bazel build -c opt --config=android_arm64 mediapipe/tasks/java/com/google/mediapipe/tasks/genai:libllm_inference_engine_jni.so)
I looked at the Bazel build files and found that the define ENABLE_ODML_MAVEN_BUILD is needed to build the library with GPU support.
(bazel build -c opt --config=android_arm64 --define ENABLE_ODML_MAVEN_BUILD=1 mediapipe/tasks/java/com/google/mediapipe/tasks/genai:libllm_inference_engine_jni.so)
However, when I build with this define, Bazel reports that @odml cannot be found.
ERROR: /home/rtos/Downloads/mediapipe/mediapipe/tasks/java/com/google/mediapipe/tasks/core/jni/BUILD:62:11: no such package '@odml//odml/infra/genai/inference': The repository '@odml' could not be resolved: Repository '@odml' is not defined and referenced by '//mediapipe/tasks/java/com/google/mediapipe/tasks/core/jni:llm'
kuaashish commented
github-actions commented
This issue has been marked stale because it has had no recent activity for 7 days. It will be closed if no further activity occurs. Thank you.
github-actions commented
This issue was closed due to lack of activity after being marked stale for the past 7 days.