
Realtime Hand

Unity AR package to track your hand in realtime!

As seen on "Let's All Be Wizards!" : https://apps.apple.com/app/id1609685010

Features

  • 60 FPS hand detection
  • 3D bone detection in world space

Sample Videos

SampleSmall.mov
SampleVideoLightning.mov

Requirements

  • Unity 2020.3 LTS
  • ARFoundation
  • iPhone with LiDAR support (iPhone 12 Pro or later)

Installation

  • Add the RealtimeHand package to your manifest
  • Add the SwiftSupport package to enable Swift development
  • Check the RealtimeHandSample for usage
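
The manifest entry might look like the sketch below. The package identifiers and the ARFoundation version are placeholders, not taken from the repository; substitute the actual git URL or local path of the RealtimeHand package.

```json
{
  "dependencies": {
    "com.ogoguel.realtimehand": "<git URL or local path of the RealtimeHand package>",
    "com.unity.xr.arfoundation": "4.2.7"
  }
}
```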

Classes

RTHand.Joint

  • screenPos: 2D position in normalized screen coordinates
  • texturePos: 2D position in normalized CPU image coordinates
  • worldPos: 3D position in world space
  • name: name of the joint, matching the native one
  • distance: distance from the camera, in meters
  • isVisible: whether the joint has been identified (from the native pose detection)
  • confidence: confidence of the detection

RTHand.RealtimeHandManager

Does most of the heavy lifting for you: just add it to your project and subscribe to the `HandUpdated` event to be notified when a hand pose has been detected.

Steps:

  • Create a GameObject
  • Add the RealtimeHandManager component
  • Configure it with the ARSession, ARCameraManager, and AROcclusionManager objects
  • Subscribe to Action<RealtimeHand> HandUpdated; to be notified

The AROcclusionManager must be configured with temporalSmoothing=Off and mode=fastest for optimal results.
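
The steps above can be sketched as a small component like the one below. This is a minimal sketch, not code from the package: the `RTHand` namespace and members follow the API listed in this README, but the joint dictionary key ("wrist") is an assumption based on the native joint names.

```csharp
using UnityEngine;
using RTHand; // namespace assumed from the class names in this README

public class HandListener : MonoBehaviour
{
    // Assign the RealtimeHandManager (already configured with
    // ARSession, ARCameraManager, AROcclusionManager) in the inspector.
    [SerializeField] RealtimeHandManager _handManager;

    void OnEnable()  => _handManager.HandUpdated += OnHandUpdated;
    void OnDisable() => _handManager.HandUpdated -= OnHandUpdated;

    void OnHandUpdated(RealtimeHand _hand)
    {
        if (!_hand.IsVisible)
            return;

        // Example: follow the wrist joint ("wrist" key is hypothetical)
        if (_hand.Joints.TryGetValue("wrist", out Joint joint) && joint.isVisible)
            transform.position = joint.worldPos;
    }
}
```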

RTHand.RealtimeHand

If you want full control over the flow, you can manually initialize and call the hand detection process: more work, but more control.

Properties

  • IsInitialized: whether the object has been properly initialized (i.e. the ARSession has been retrieved)
  • IsVisible: whether the hand is currently visible
  • Joints: dictionary of all the joints

Functions

  • Initialize(ARSession _session, ARCameraManager _arCameraManager, Matrix4x4 _unityDisplayMatrix): initialize the object with the required components

The session must be in tracking mode

  • Dispose(): release the component and its resources
  • Process(CPUEnvironmentDepth _environmentDepth, CPUHumanStencil _humanStencil): launch the detection method using the depth buffers

Check the RealtimeHandManager as an example.
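
A minimal sketch of the manual flow, assuming only the signatures listed above; how the CPUEnvironmentDepth and CPUHumanStencil buffers are acquired is package-specific and elided here.

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using RTHand; // namespace assumed from the class names in this README

public class ManualHandDriver : MonoBehaviour
{
    [SerializeField] ARSession _session;
    [SerializeField] ARCameraManager _cameraManager;

    RealtimeHand _hand = new RealtimeHand();
    Matrix4x4 _displayMatrix; // obtain from the AR frame's display transform

    void Start()
    {
        // The session must already be in tracking mode at this point
        _hand.Initialize(_session, _cameraManager, _displayMatrix);
    }

    // Call once per frame with the CPU depth buffers
    void ProcessFrame(CPUEnvironmentDepth depth, CPUHumanStencil stencil)
    {
        if (!_hand.IsInitialized)
            return;

        _hand.Process(depth, stencil);

        if (_hand.IsVisible)
        {
            // Read _hand.Joints here
        }
    }

    void OnDestroy() => _hand.Dispose();
}
```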

Under the hood

When a camera frame is received:

  • Execute VNDetectHumanHandPoseRequest synchronously to retrieve a 2D pose estimation from the OS
  • Retrieve the environmentDepth and humanStencil CPU images
  • From the 2D position of each bone, extract its distance using the depth images to reconstruct a 3D position
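
The last step can be illustrated as follows. This is not the package's actual code, just a sketch of the idea: given a joint's normalized screen position and its depth in meters (sampled from the environment depth image), the AR camera can unproject it to a world-space point.

```csharp
using UnityEngine;

public static class DepthUnprojection
{
    // screenPos01: normalized screen coordinates (0..1), as in RTHand.Joint.screenPos
    // distanceMeters: depth sampled from the environment depth image
    public static Vector3 ToWorld(Camera arCamera, Vector2 screenPos01, float distanceMeters)
    {
        // Convert normalized coordinates to pixels, use depth as the z component,
        // then unproject through the camera
        Vector3 pixel = new Vector3(screenPos01.x * arCamera.pixelWidth,
                                    screenPos01.y * arCamera.pixelHeight,
                                    distanceMeters);
        return arCamera.ScreenToWorldPoint(pixel);
    }
}
```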

Revisions

  • Fix compatibility with Unity 2020.3
  • Added Lightning Shader & effects
  • Initial Release


Contributors

ogoguel



Issues

Suggestion

I did a lot of work with 3D hand reconstruction in ARHeadsetKit. More specifically, I solved major problems with power consumption. Would it be helpful if I showed you what I did?

Request for Guidance on Simulating Reflections in Unity AR Hand-Tracking Project

Good afternoon,

Thank you once again for your previous response; I truly appreciate your help. I successfully implemented a hand-tracking system for two hands, though I used a different solution. However, your repository was incredibly valuable, especially for calculating depth using GetHumanDistanceFromEnvironment.

My next challenge is simulating reflections on the hand. Simply casting a light near the hand doesn't enhance the experience. I followed your breakdown on estimating normals and managed to estimate the normal of my hand, but I realized that my approach was essentially post-processing and that the estimated normal isn't usable.

Could you provide guidance or a reference on how to simulate reflections on the hand effectively?

Thank you for your time and assistance.
