This fork serves as a sample of how to receive a Bitmap image from the graph for additional postprocessing (google-ai-edge#831). It works with the Android example app https://github.com/Cubbee/mediapipe_multi_hands_tracking_aar_example
- On the MediaPipe side, add a `GpuBufferToImageFrameCalculator` node to the graph, for example in
  `mediapipe/graphs/hand_tracking/hand_tracking_mobile.pbtxt`:

  ```
  node {
    calculator: "GpuBufferToImageFrameCalculator"
    input_stream: "throttled_input_video"
    output_stream: "throttled_input_video_cpu"
  }
  ```
- Add the calculator as a dependency in the corresponding graph BUILD file,
  `mediapipe/graphs/hand_tracking/BUILD`:

  ```
  cc_library(
      name = "mobile_calculators",
      deps = [
          "//mediapipe/calculators/core:constant_side_packet_calculator",
          "//mediapipe/calculators/core:flow_limiter_calculator",
          "//mediapipe/graphs/hand_tracking/subgraphs:hand_renderer_gpu",
          "//mediapipe/modules/hand_landmark:hand_landmark_tracking_gpu",
          "//mediapipe/gpu:gpu_buffer_to_image_frame_calculator",
      ],
  )
  ```
- Listen to the packet (use the CPU output stream name from your graph) and decode it into a `Bitmap`:

  ```kotlin
  processor.addPacketCallback("transformed_image_cpu") { packet ->
      println("Received image with ts: ${packet.timestamp}")
      val image = AndroidPacketGetter.getBitmapFromRgba(packet)
  }
  ```
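  For context, here is a minimal sketch of where this callback fits into the example app's `FrameProcessor` setup. The graph/stream names, the `EglManager` usage, and the helper function itself are assumptions based on the linked example app, not part of this fork:

  ```kotlin
  import android.app.Activity
  import android.graphics.Bitmap
  import com.google.mediapipe.components.FrameProcessor
  import com.google.mediapipe.framework.AndroidPacketGetter
  import com.google.mediapipe.glutil.EglManager

  // Hypothetical helper; assumes MediaPipe's native libraries are already loaded,
  // as the example app does in its Activity.
  fun setUpBitmapCallback(activity: Activity): FrameProcessor {
      val eglManager = EglManager(null)
      val processor = FrameProcessor(
          activity,
          eglManager.nativeContext,            // shared GL context for GPU packets
          "hand_tracking_mobile_gpu.binarypb", // graph built later with ./build_aar.sh
          "input_video",                       // graph input stream
          "output_video"                       // graph output stream (rendered frames)
      )
      processor.addPacketCallback("transformed_image_cpu") { packet ->
          // getBitmapFromRgba copies the packet data into a new Bitmap, so it is
          // safe to hand it off for postprocessing after the callback returns.
          val bitmap: Bitmap = AndroidPacketGetter.getBitmapFromRgba(packet)
          // ... run additional postprocessing on `bitmap` here
      }
      return processor
  }
  ```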
- Clone this repo.
- Check out the sample branch: `git checkout 0.8.2-bitmap-sample`
- Build and run the Docker image: `./run_docker.sh`
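  If you prefer to run the steps by hand, here is a rough sketch of what such a wrapper script typically does (the image tag and mount point are assumptions, not taken from this fork's script):

  ```sh
  # Assumed equivalent of ./run_docker.sh: build the MediaPipe dev image from the
  # repo's Dockerfile and open a shell with the repo mounted at /mediapipe.
  docker build -t mediapipe-bitmap-sample .
  docker run -it --rm -v "$PWD":/mediapipe -w /mediapipe mediapipe-bitmap-sample bash
  ```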
- Build the AAR and the graph inside Docker: `./build_aar.sh`

  This creates `hand_tracking_mobile_gpu.binarypb` and `mp_multi_hand_aar.aar` in the repo root folder.
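  For reference, a script like this usually wraps the standard MediaPipe 0.8.x Bazel builds; a rough sketch of the underlying commands (the AAR target label is a placeholder, the real one is defined in this fork's BUILD files, and the actual script may differ):

  ```sh
  # Placeholder label for the mediapipe_aar() target that produces mp_multi_hand_aar.aar.
  bazel build -c opt --strip=ALWAYS \
      --host_crosstool_top=@bazel_tools//tools/cpp:toolchain \
      --fat_apk_cpu=arm64-v8a,armeabi-v7a \
      //path/to:mp_multi_hand_aar

  # Builds hand_tracking_mobile_gpu.binarypb from hand_tracking_mobile.pbtxt.
  bazel build -c opt mediapipe/graphs/hand_tracking:hand_tracking_mobile_gpu_binary_graph
  ```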
- Copy `mp_multi_hand_aar.aar` to `app/libs` and `hand_tracking_mobile_gpu.binarypb` to `app/src/main/assets` in the example app.
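  For example, from the MediaPipe repo root (the example app's checkout location is an assumption, so adjust `EXAMPLE_APP` as needed):

  ```sh
  EXAMPLE_APP=../mediapipe_multi_hands_tracking_aar_example
  cp mp_multi_hand_aar.aar "$EXAMPLE_APP/app/libs/"
  cp hand_tracking_mobile_gpu.binarypb "$EXAMPLE_APP/app/src/main/assets/"
  ```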