Comments (5)
Hi,
Thanks for trying out the dataset. The bent fingers are a tracking issue: I did not use a proper skeleton size for the hand, which led the Captury software to fit the hand keypoints with a default (longer) bone length. Hence the fingers appear bent.
To resolve this problem, we have created MANO parameters that fit the ground truth very closely. Below are the 3D overlays of the MANO model on the event and RGB streams.
Best,
Christen
rgb_render.mp4
event_render.mp4
from ev2hands.
Oh yes, I tried plotting the joints obtained from the provided MANO parameters; they fit the input frames much better than the joints in the .pickle files of the Ev2Hands-R dataset (obtained from Captury). Thank you for your help :)
Concerning this new information, I have a couple of questions.
- How did you estimate these MANO parameters? Could you share the corresponding code?
- According to the code, the joints used as targets for finetuning and evaluation are the ones obtained from Captury, not the ones corresponding to these MANO parameters. This is based on the following observations:
  - The loss uses forward_non_mano_data (losses.py), as mano_gt is set to 0 (ev2hands_r.py).
  - The dataset uses the pickle file provided in the Ev2Hands-R dataset (evaluation_stream.py), which appears to contain the joints obtained from Captury.
Please correct me if I am missing something.
--
Pratik
Hi,
Answering your questions:
- The following procedure was used to estimate the MANO parameters:
  - Compute 2D keypoints from the multi-view camera images.
  - Triangulate the 2D keypoints to obtain 3D keypoints.
  - Use the 3D keypoints to perform inverse kinematics (IK) and obtain the MANO parameters.
  I am not planning to release the code for this procedure, but it is very similar to EasyMocap.
- Unfortunately, I found this issue only after running the evaluations, so the metrics reported in the paper are computed with the Captury joints.
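For reference, the triangulation step in the procedure above is typically done with the standard Direct Linear Transform (DLT). A minimal sketch, not the authors' actual code; the camera matrices and keypoint below are made-up toy values:

```python
import numpy as np

def triangulate_point(proj_mats, points_2d):
    """Triangulate one 3D point from N calibrated views via DLT:
    each view contributes two linear constraints, and the point is
    the null vector of the stacked system (via SVD)."""
    rows = []
    for P, (u, v) in zip(proj_mats, points_2d):
        rows.append(u * P[2] - P[0])  # u * (p3 . X) - p1 . X = 0
        rows.append(v * P[2] - P[1])  # v * (p3 . X) - p2 . X = 0
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

# Toy two-view setup (hypothetical intrinsics and extrinsics).
K = np.array([[500.0, 0, 320], [0, 500, 240], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.2], [0], [0]])])
X_true = np.array([0.1, -0.05, 2.0, 1.0])
x1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]  # projected 2D keypoint, view 1
x2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]  # projected 2D keypoint, view 2
X_rec = triangulate_point([P1, P2], [x1, x2])
```

With noiseless projections the recovered point matches the original; with real detected keypoints the SVD gives the least-squares solution, which is then refined by the IK fit to MANO.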
We also tentatively plan to release more data with more participants (with better joint fitting), along with the camera extrinsics for the event and RGB cameras.
Best,
Christen
Your answers help clear up my doubts about the dataset. Thank you!
The camera extrinsics between the event and RGB cameras, along with the RGB camera intrinsics, would be very useful. I look forward to this release.
Hi Pratik,
The camera parameters have been added along with the MANO parameters.
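For anyone wanting to map points between the two streams with these parameters, the usual pipeline is a rigid transform using the extrinsics followed by a pinhole projection using the intrinsics. A generic sketch only; the variable names and conventions here are assumptions, not the dataset's actual format:

```python
import numpy as np

def project_to_event(points_rgb, R, t, K_event):
    """Map 3D points (N, 3) from the RGB camera frame into the event
    camera frame using extrinsics (R, t), then project to pixel
    coordinates with the event camera intrinsics K_event (3x3)."""
    pts_event = points_rgb @ R.T + t   # rigid transform between camera frames
    proj = pts_event @ K_event.T       # apply pinhole intrinsics
    return proj[:, :2] / proj[:, 2:3]  # perspective divide

# Toy example: identity extrinsics, simple intrinsics. A point on the
# optical axis should land at the principal point (160, 120).
K = np.array([[400.0, 0, 160], [0, 400, 120], [0, 0, 1]])
pts = np.array([[0.0, 0.0, 2.0]])
uv = project_to_event(pts, np.eye(3), np.zeros(3), K)
```

Whether the released extrinsics are expressed RGB-to-event or event-to-RGB (and row- vs column-vector convention) would need to be checked against the release itself.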
Best,
Christen