jiuqiant / mediapipe_multi_hands_tracking_aar_example
MediaPipe multi-hand tracking GPU demo with MediaPipe's Android archive library
Hello, I want to compile a 3D multi-hand detection AAR.
My build script is as follows:
load("//mediapipe/java/com/google/mediapipe:mediapipe_aar.bzl", "mediapipe_aar")
mediapipe_aar(
    name = "mp_multi_3d_handtracking_aar",
    calculators = ["//mediapipe/graphs/hand_tracking:multi_hand_mobile_calculators"],
)
Run the command:
bazel build -c opt --fat_apk_cpu=arm64-v8a,armeabi-v7a --define 3D=true //mediapipe/examples/android/src/java/com/google/mediapipe/apps/aar_examples:mp_multi_3d_handtracking_aar
In the real application, however, the output still appears as a 2D frame.
Are my method and flow right?
I think the default is 2D hand pose estimation; what other steps should I take to turn on the 3D hand estimation switch?
Thanks & Regards!
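Not confirmed by this thread, but one possibility worth checking: the mediapipe_aar target only packages the native calculators, while the binary graph and model files must be bundled separately as app assets and must match the 3D build. A sketch of the asset side (the exact target and file names are assumptions based on the MediaPipe tree of that era, not verified here):

```shell
# Assumption: the 3D variant also needs the matching binary graph and 3D
# model copied into the app's assets/ folder, in addition to the AAR.
bazel build -c opt --define 3D=true \
    mediapipe/graphs/hand_tracking:multi_hand_tracking_mobile_gpu_binary_graph

cp bazel-bin/mediapipe/graphs/hand_tracking/multi_hand_tracking_mobile_gpu.binarypb \
    app/src/main/assets/
# The hand landmark model also has a 3D variant; make sure the model asset
# the app loads matches the --define 3D=true build.
```

If only the AAR is rebuilt but the app keeps loading the 2D graph/model assets, the output would still look 2D, which would match the symptom described above.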
Hello, firstly thank you for this example. I've cloned this project to my computer and successfully installed the app on my phone. However, the frame rate is very low when running, only about 10 FPS, which I don't think is real time. Did I miss something? My phone is a Samsung A21s: 3 GB RAM, Mali-G52 GPU.
When I first imported the project into Android Studio, it came up with this error after building:
The minSdk version should not be declared in the android manifest file. You can move the version from the manifest to the defaultConfig in the build.gradle file.
I refactored by removing the minSdkVersion and syncing the project again, and the project built and ran without error. However, when I attempt to run the project in the Android emulator, it only ever shows the message: "MyMultiHandsTrackingApp keeps stopping".
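For reference, the manifest-to-Gradle move described above usually ends up as something like this (a sketch; the version numbers are placeholders and should match whatever the manifest originally declared):

```groovy
// app/build.gradle -- sketch; version numbers are placeholders.
android {
    defaultConfig {
        applicationId "com.example.mediapipemultihandstrackingapp"
        minSdkVersion 21      // moved here from AndroidManifest.xml
        targetSdkVersion 29
    }
}
```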
Here's the logcat:
2020-06-01 17:15:08.228 3497-3497/? W/andstrackingap: Unexpected CPU variant for X86 using defaults: x86_64
2020-06-01 17:15:08.970 3497-3497/com.example.mediapipemultihandstrackingapp I/andstrackingap: The ClassLoaderContext is a special shared library.
2020-06-01 17:15:09.409 3497-3497/com.example.mediapipemultihandstrackingapp D/Camera2Initializer: CameraX initializing with Camera2 ...
2020-06-01 17:15:09.438 3497-3497/com.example.mediapipemultihandstrackingapp I/CameraManagerGlobal: Connecting to camera service
2020-06-01 17:15:10.103 3497-3497/com.example.mediapipemultihandstrackingapp W/andstrackingap: Verification of void androidx.camera.core.AutoValue_SurfaceSizeDefinition.<init>(android.util.Size, android.util.Size, android.util.Size) took 597.673ms
2020-06-01 17:15:10.167 3497-3497/com.example.mediapipemultihandstrackingapp D/CameraRepository: Added camera: 0
2020-06-01 17:15:10.199 3497-3529/com.example.mediapipemultihandstrackingapp D/UseCaseAttachState: Active and online use case: [] for camera: 0
2020-06-01 17:15:10.201 3497-3497/com.example.mediapipemultihandstrackingapp D/CameraRepository: Added camera: 1
2020-06-01 17:15:10.207 3497-3529/com.example.mediapipemultihandstrackingapp D/UseCaseAttachState: Active and online use case: [] for camera: 1
2020-06-01 17:15:10.250 3497-3497/com.example.mediapipemultihandstrackingapp D/AndroidRuntime: Shutting down VM
2020-06-01 17:15:10.260 3497-3497/com.example.mediapipemultihandstrackingapp E/AndroidRuntime: FATAL EXCEPTION: main
Process: com.example.mediapipemultihandstrackingapp, PID: 3497
java.lang.UnsatisfiedLinkError: dalvik.system.PathClassLoader[DexPathList[[zip file "/system/framework/org.apache.http.legacy.boot.jar", zip file "/data/app/com.example.mediapipemultihandstrackingapp-xr2Y3qGpPY5HIlyiUsVbUw==/base.apk"],nativeLibraryDirectories=[/data/app/com.example.mediapipemultihandstrackingapp-xr2Y3qGpPY5HIlyiUsVbUw==/lib/x86_64, /system/lib64]]] couldn't find "libmediapipe_jni.so"
at java.lang.Runtime.loadLibrary0(Runtime.java:1012)
at java.lang.System.loadLibrary(System.java:1669)
at com.example.mediapipemultihandstrackingapp.MainActivity.<clinit>(MainActivity.java:38)
at java.lang.Class.newInstance(Native Method)
at android.app.AppComponentFactory.instantiateActivity(AppComponentFactory.java:69)
at androidx.core.app.CoreComponentFactory.instantiateActivity(CoreComponentFactory.java:41)
at android.app.Instrumentation.newActivity(Instrumentation.java:1215)
at android.app.ActivityThread.performLaunchActivity(ActivityThread.java:2831)
at android.app.ActivityThread.handleLaunchActivity(ActivityThread.java:3048)
at android.app.servertransaction.LaunchActivityItem.execute(LaunchActivityItem.java:78)
at android.app.servertransaction.TransactionExecutor.executeCallbacks(TransactionExecutor.java:108)
at android.app.servertransaction.TransactionExecutor.execute(TransactionExecutor.java:68)
at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1808)
at android.os.Handler.dispatchMessage(Handler.java:106)
at android.os.Looper.loop(Looper.java:193)
at android.app.ActivityThread.main(ActivityThread.java:6669)
at java.lang.reflect.Method.invoke(Native Method)
at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:493)
at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:858)
2020-06-01 17:15:13.937 3497-3497/com.example.mediapipemultihandstrackingapp I/Process: Sending signal. PID: 3497 SIG: 9
The fatal error there shows that it couldn't find "libmediapipe_jni.so".
Any idea what I am missing here? Thanks and regards :)
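A plausible cause (an assumption, not confirmed in this thread): the logcat shows the emulator resolving native libraries from lib/x86_64, but an AAR built with --fat_apk_cpu=arm64-v8a,armeabi-v7a contains no x86_64 copy of libmediapipe_jni.so, so the loader fails. A sketch of a build that also covers the emulator ABIs:

```shell
# Assumption: adding the emulator ABIs so libmediapipe_jni.so exists for
# x86/x86_64 as well as the ARM targets.
bazel build -c opt \
    --fat_apk_cpu=arm64-v8a,armeabi-v7a,x86,x86_64 \
    //mediapipe/examples/android/src/java/com/google/mediapipe/apps/aar_examples:mp_multi_3d_handtracking_aar
```

Even then, the GPU graph may not run well inside an emulator; testing on a physical ARM device sidesteps the problem entirely.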
Hi, for some reason this seems to be much slower than the released APK:
https://drive.google.com/open?id=1uCjS0y0O0dTDItsMh8x2cf4-l3uHW1vE
Is this simply due to the build method? Is there any way to speed up the inference if I have to use Gradle?
Many thanks!
Hi, landmarks are only detected when they are rendered on the surface view. If we disable the following lines in setupPreviewDisplayView():
ViewGroup viewGroup = findViewById(R.id.preview_display_layout);
viewGroup.addView(previewDisplayView);
then no landmarks are detected. I want the landmarks without the surface view.
ref google-ai-edge/mediapipe#1360
Do we have any true positive on the dockerized hands sample?
inputSidePackets.put("min_detection_confidence", packetCreator.createFloat32(0.9f)); invalid
Any idea how to get the hand data?
The same problem occurs with the face-detection demo: https://github.com/jiuqiant/mediapipe_face_detection_aar_example
I followed this post: https://google.github.io/mediapipe/getting_started/android_archive_library and my AAR file is about 120 MB, whereas your AAR file is 52.7 MB. How can I reduce the size of my AAR file? (I build for one-hand tracking only.)
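Not verified against this repo, but two things that commonly shrink a MediaPipe AAR: building a single ABI instead of a fat one, and letting the linker strip debug symbols from the native library. A sketch (the target label is a placeholder for your own AAR target):

```shell
# Assumption: one ABI roughly halves a two-ABI fat AAR, and --strip=always
# removes debug symbols from libmediapipe_jni.so.
bazel build -c opt --strip=always \
    --fat_apk_cpu=arm64-v8a \
    //path/to/your:hand_tracking_aar   # placeholder target label
```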
System.err: java.lang.IllegalArgumentException: com.google.protobuf.InvalidProtocolBufferException: While parsing a protocol message, the input ended unexpectedly in the middle of a field. This could mean either that the input has been truncated or that an embedded message misreported its own length.
2020-11-12 19:32:54.909 11434-11599/com.example.mediapipemultihandstrackingapp W/System.err: at com.google.mediapipe.framework.PacketGetter.getProtoVector(PacketGetter.java:152)
When I run my Android code, it fails with this error:
JNI DETECTED ERROR IN APPLICATION: JNI CallVoidMethodV called with pending exception java.lang.NoSuchMethodError: No static method registerDefaultInstance(Ljava/lang/Class;Lcom/google/protobuf/GeneratedMessageLite;)V in class Lcom/google/protobuf/GeneratedMessageLite; or its super classes (declaration of 'com.google.protobuf.GeneratedMessageLite' appears in /data/app/com.example.mediapipehandtracking-1/base.apk) art/runtime/java_vm_ext.cc:410] at void com.google.mediapipe.formats.proto.LandmarkProto$NormalizedLandmarkList.<clinit>() (LandmarkProto.java:1684) art/runtime/java_vm_ext.cc:410] at void com.example.mediapipehandtracking.MainActivity.lambda$onCreate$0$MainActivity(com.google.mediapipe.framework.Packet) (MainActivity.java:108) art/runtime/java_vm_ext.cc:410] at void com.example.mediapipehandtracking.-$$Lambda$MainActivity$Ebp1jy5s4WkhQ-ZwIjp9HiYcpYI.process(com.google.mediapipe.framework.Packet) (lambda:-1)
Hoping for your answer, thanks.
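A common cause of this particular NoSuchMethodError (an assumption here, since the Gradle file isn't shown) is two protobuf runtimes on the classpath: the full protobuf-java runtime pulled in transitively alongside the lite runtime that the MediaPipe proto classes were generated against. A sketch of pinning a single lite runtime in Gradle (the version number is a placeholder; match the one your AAR was generated against):

```groovy
// app/build.gradle -- sketch; the protobuf version is a placeholder.
dependencies {
    implementation 'com.google.protobuf:protobuf-javalite:3.11.4'
}
configurations.all {
    // Keep the full runtime from sneaking in transitively and shadowing
    // the lite runtime's GeneratedMessageLite.
    exclude group: 'com.google.protobuf', module: 'protobuf-java'
}
```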
Hi all
Thanks for the great samples on MediaPipe.
I am new to MediaPipe and have been trying multiple examples, especially the hand tracking example. It is a great starting point, and I was able to successfully build and run it on my Samsung A50 (specs). This is an ARM-architecture device and the application works perfectly.
I am trying to run the same app on another ARM-architecture device, the Vuzix M400 (specs), on the same Android OS version. Both devices seem to be on par spec-wise. The problem is that on the M400 the landmarks callback for OUTPUT_LANDMARKS_STREAM_NAME from FrameProcessor is not consistent during startup: I launch the app, it starts with a blank screen and no callbacks. If I open it multiple times, it sometimes succeeds and works. This is strange behavior.
I tried to debug this and also added an additional consumer to ExternalTextureConverter to get the onNewFrame(TextureFrame frame) callback, to check whether the camera is sending frames and MediaPipe is receiving them. Even then the behavior is the same: the app sometimes launches with a blank screen and no callback, and sometimes with proper callbacks.
I created another app with just a camera preview, without any MediaPipe code/libraries processing the camera, and that works fine.
I did all the things a typical developer does but could not find any clue; any help is appreciated.
Thanks very much.
D/UseCaseAttachState: All use case: [androidx.camera.core.Preview-27cce9c0-8fa0-445c-a7e1-b01fa4accbdb265504916] for camera: 0
D/DeferrableSurface: New surface in use[total_surfaces=1, used_surfaces=1](androidx.camera.core.SurfaceRequest$2@f72883}
use count+1, useCount=1 androidx.camera.core.SurfaceRequest$2@f72883
D/CaptureSession: Opening capture session.
D/CaptureSession: Attempting to send capture request onConfigured
Issuing request for session.
D/CaptureSession: CameraCaptureSession.onConfigured() mState=OPENED
D/CaptureSession: CameraCaptureSession.onReady() OPENED
D/CameraOrientationUtil: getRelativeImageRotation: destRotationDegrees=0, sourceRotationDegrees=270, isOppositeFacing=true, result=270
I used the following command to create the AAR, but when I substituted my AAR into your project, the app crashed on my phone, while your original project works well. Could you please tell me the bazel command and BUILD file you used to create the AAR?
Mine are below.
bazel command:
bazel build -c opt --host_crosstool_top=@bazel_tools//tools/cpp:toolchain --fat_apk_cpu=arm64-v8a,armeabi-v7a //mediapipe/examples/android/src/java/com/google/mediapipe/apps/aar_example:mp_multi_hand_aar
bazel version: 2.0.0
Android SDK version: 28
NDK version: 20.1.5948944
build file:
load("//mediapipe/java/com/google/mediapipe:mediapipe_aar.bzl", "mediapipe_aar")
mediapipe_aar(
    name = "mp_multi_hand_aar",
    calculators = ["//mediapipe/graphs/hand_tracking:multi_hand_mobile_calculators"],
)
Hey, please make another repo for 3D object detection using MediaPipe.
How should I get the data for the palm, such as the location of the palm, its width, height, and rotation?
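The thread doesn't answer this, but once the 21 normalized landmarks are read out of the output packet, palm location, size, and rotation reduce to plain geometry. A hypothetical helper (the landmark indices 0 = wrist, 5 = index MCP, 9 = middle MCP, 17 = pinky MCP come from MediaPipe's hand topology; everything else here is an illustrative assumption, not this repo's API):

```java
// Hypothetical helper: derives a palm center, width, and rotation from the
// 21 normalized hand landmarks, passed as a [21][2] array of (x, y) pairs.
// Pure geometry; no MediaPipe classes involved.
public class PalmGeometry {
    public static double[] palmCenter(double[][] lm) {
        // Average the wrist (0) and the index (5) / pinky (17) knuckles.
        int[] ids = {0, 5, 17};
        double x = 0, y = 0;
        for (int i : ids) { x += lm[i][0]; y += lm[i][1]; }
        return new double[] {x / ids.length, y / ids.length};
    }

    public static double palmWidth(double[][] lm) {
        // Distance between the index (5) and pinky (17) knuckles.
        double dx = lm[5][0] - lm[17][0];
        double dy = lm[5][1] - lm[17][1];
        return Math.hypot(dx, dy);
    }

    public static double palmRotationDegrees(double[][] lm) {
        // Angle of the wrist (0) -> middle-MCP (9) vector, with 0 degrees
        // meaning the palm points straight up in image coordinates
        // (y grows downward in normalized landmark space).
        double dx = lm[9][0] - lm[0][0];
        double dy = lm[9][1] - lm[0][1];
        return Math.toDegrees(Math.atan2(dx, -dy));
    }
}
```

Height can be derived the same way, e.g. as the wrist-to-middle-MCP distance; whether these normalized values need rescaling by the frame dimensions depends on how you consume them.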
I want to remove the display of the 21 landmark points while running it on the phone, but I cannot figure out where to make changes to remove it. Can you please help me with that?