Comments (10)
Could you post the procedure you use to set up ViSP as a backend, please?
from easy_handeye.
Thanks a lot @raultron! Always nice and encouraging to hear good feedback.
@nathanlem1: it would be great if you could create a new issue for each topic, to keep the repo organised.
I am using the ros-visp wrapper, which offers ROS interfaces (topics and services) for the ViSP library. In particular, I am using the hand2eye service.
The wrapper can be installed via apt in Ubuntu, as it is automatically packaged by the ROS community:
sudo apt-get install ros-lunar-vision-visp
After installing, you can start a roscore and the calibration service, as described in the docs.
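As a sketch, the calibrator can also be brought up from a launch file; the package, node, and service names below are taken from the visp_hand2eye_calibration wrapper and should be checked against its documentation for your ROS distribution:

```xml
<launch>
  <!-- starts the ViSP hand-eye calibration node, which advertises the
       compute_effector_camera_quick service used by easy_handeye -->
  <node name="hand2eye_calibrator"
        pkg="visp_hand2eye_calibration"
        type="visp_hand2eye_calibration_calibrator"
        output="screen" />
</launch>
```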
Not really an Issue, just wanted to thank you. This package helped me with a crucial calibration that I needed on a tight schedule.
Hi, could you please let me know how you did the calibration? I am still struggling to get good results: every time I compute the calibration with a new set of samples I get different results, and all of them are wrong.
I am performing the eye in hand calibration with a RealSense d435 camera and a UR10 robot.
Thanks in advance
Hi there,
My calibration setup was somewhat specific. I have a camera with spherical markers placed on top of it, which are detected by an Optitrack system with millimeter accuracy. I wanted the transform between the camera coordinates and the spherical-marker coordinates.
In this case the camera has to move and the Aruco marker is fixed in space, so I had to set up the topics correctly for easy_handeye with a launch file. I placed the Aruco marker on a table facing the camera, put the camera on a tripod, and moved it around, changing both the relative orientation and the distance, for more than 20 measurements. I made sure the setup was completely static during each measurement and tried to cover the working volume evenly. easy_handeye then gave me the transform between the camera and the spherical-marker coordinates.
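Under the hood, the sampled pairs feed the classic AX = XB hand-eye equation: the A_i are the tracked-body motions between samples, the B_i are the corresponding camera motions seen through the Aruco marker, and X is the fixed camera-to-markers transform being estimated. A minimal numpy sketch (all names and values hypothetical) of why a consistent X must satisfy every motion pair:

```python
import numpy as np

def rot_z(theta):
    # homogeneous transform: rotation about the z axis
    c, s = np.cos(theta), np.sin(theta)
    T = np.eye(4)
    T[:2, :2] = [[c, -s], [s, c]]
    return T

def with_translation(T_rot, t):
    # attach a translation to a rotation-only homogeneous transform
    T = T_rot.copy()
    T[:3, 3] = t
    return T

# hypothetical ground-truth camera->markers transform X
X = with_translation(rot_z(0.3), [0.05, 0.0, 0.10])

# simulate tracked-body motions A_i and the camera motions B_i
# they induce: A_i X = X B_i  =>  B_i = X^-1 A_i X
A_list = [with_translation(rot_z(a), [0.2 * a, 0.1, 0.0])
          for a in (0.5, 1.0, 1.5)]
B_list = [np.linalg.inv(X) @ A @ X for A in A_list]

# for the true X, every residual A_i X - X B_i vanishes;
# the solver searches for the X that minimises these residuals
residuals = [np.abs(A @ X - X @ B).max() for A, B in zip(A_list, B_list)]
print(max(residuals))
```

Varying both rotation and translation across samples, as described above, is exactly what makes the system of equations well constrained.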
I also performed an additional calibration.
I placed additional spherical markers on the Aruco sheet, and then I needed the transform from the Optitrack coordinates of those markers to the center of the Aruco marker. For that case I changed the topics in the launch file so that the camera is the one that is fixed and the Aruco marker is the one that moves.
With those two calibrations I could obtain ground-truth measurements for my camera-to-Aruco-marker detection; all the measurements were consistent with what I was measuring, so the calibrations were correct.
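The two calibrations can then be chained into a ground-truth camera-to-Aruco transform by composing homogeneous matrices. A sketch with hypothetical, purely translational poses (frame names invented for illustration; `a_T_b` maps points from frame b into frame a):

```python
import numpy as np

def trans(x, y, z):
    # homogeneous transform that is a pure translation
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T

cam_T_sphcam   = trans(0.0, 0.0, 0.05)  # calibration 1: camera -> its spherical markers
world_T_sphcam = trans(1.0, 0.0, 0.50)  # Optitrack pose of the camera's markers
world_T_sphtag = trans(1.5, 0.0, 0.00)  # Optitrack pose of the markers on the Aruco sheet
sphtag_T_aruco = trans(0.0, 0.02, 0.0)  # calibration 2: sheet markers -> Aruco center

# chain the frames: camera -> its markers -> world -> sheet markers -> Aruco
cam_T_aruco = (cam_T_sphcam
               @ np.linalg.inv(world_T_sphcam)
               @ world_T_sphtag
               @ sphtag_T_aruco)
print(cam_T_aruco[:3, 3])  # ground-truth camera->Aruco translation
```

Comparing this composed transform against the live Aruco detection is what makes the consistency check above possible.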
Thank you very much for your detailed answer, I appreciate it. I hope I can make it work, and then I will share the experience.
Have a good one.
Cesar
@Sinchiguano, did you manage to solve your issues? Off the top of my head, they may be due to a couple of factors, including:
- the marker has a different size than what is passed into the aruco configuration (printers love to scale down PDFs, so every time I check with a ruler and have to throw away a couple of printouts)
- you sample the transforms just after the robot has stopped moving, and the tf frames have not reached the final position yet because of some lag
- you have outliers in your sampled transforms, for example because in some cases the marker was not detected at all, but the old position was still lingering around in tf
- the tracking is not stable because the marker appears too small in the image, or it is parallel to the image plane of the camera (in this case the detection can be very unstable)
- the calibration of the camera is totally bananas (uncommon with RGB-D cameras since they typically come with a decent factory calibration that is picked up by the ROS driver, but definitely possible with RGB cameras if you didn't do the calibration yourself)
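The outlier case in particular is easy to screen for before computing the calibration: a sample whose translation sits far from the median of all samples is suspect. A hypothetical numpy sketch (data and threshold invented for illustration):

```python
import numpy as np

# hypothetical sampled hand->marker translations (metres); the last
# one is a stale detection that lingered in tf
samples = np.array([
    [0.301, 0.102, 0.550],
    [0.299, 0.100, 0.552],
    [0.302, 0.101, 0.549],
    [0.120, 0.480, 0.900],
])

median = np.median(samples, axis=0)          # robust centre of the cloud
dist = np.linalg.norm(samples - median, axis=1)
keep = dist < 0.05                            # 5 cm tolerance, tune per setup
print(np.where(~keep)[0])                     # indices of the suspect samples
```

Dropping the flagged samples and re-running the solver is usually enough to stabilise the result.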
Hey,
Thanks a lot for your detailed explanation, it is really helpful. Everything you detailed makes absolute sense, since I was doing most of it wrongly or with less care. I now understand what I was doing wrong thanks to your advice.
I think with your tricks, the calibration is gonna work.
Best regards
Cesar
Hey, I just wanted to thank you for this repository; I finally succeeded at the robot-camera calibration. I did both types of calibration, eye-in-hand and eye-on-base, without any problem, but without your tips I would not have made it.
Best regards
Cesar
Many thanks to the authors of this excellent repo!