
Comments (10)

nathanlem1 commented on August 16, 2024

Could you post the procedure you used to build with ViSP as a backend, please?

from easy_handeye.

marcoesposito1988 commented on August 16, 2024

Thanks a lot @raultron! Always nice and encouraging to hear good feedback.


marcoesposito1988 commented on August 16, 2024

@nathanlem1: it would be great if you could create a new issue for each topic, to keep the repo organised.

I am using the ros-visp wrapper, which offers ROS interfaces (topics and services) for the ViSP library. In particular, I am using the hand2eye service.

The wrapper can be installed via apt on Ubuntu, as it is automatically packaged by the ROS community:
sudo apt-get install ros-lunar-vision-visp
After installing, you can start a roscore and the calibration service, as described in the docs.
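The steps above can be sketched as a terminal session. This is a sketch, not the authoritative procedure: the package name is the Lunar-era one quoted above, and the exact calibrator node name should be double-checked against the visp_hand2eye_calibration documentation:

```sh
# Install the ViSP ROS wrapper (ROS Lunar package name from the comment above)
sudo apt-get install ros-lunar-vision-visp

# In one terminal: start the ROS master
roscore

# In another terminal: start the hand-eye calibration service
# (node name as shipped by visp_hand2eye_calibration; verify against its docs)
rosrun visp_hand2eye_calibration visp_hand2eye_calibration_calibrator
```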


Sinchiguano commented on August 16, 2024

Not really an Issue, just wanted to thank you. This package helped me with a crucial calibration that I needed on a tight schedule.

Hi, could you please let me know how you did the calibration? I am still struggling to get a good result: every time I compute the calibration with a new set of samples I get a different result, and all of them are wrong.

I am performing the eye in hand calibration with a RealSense d435 camera and a UR10 robot.

Thanks in advance


raultron commented on August 16, 2024

Hi there,

My calibration setup was somewhat specific. I have a camera with spherical markers mounted on top of it, which are detected by an Optitrack system with millimeter accuracy. I wanted the transform between the camera coordinates and the spherical-marker coordinates.

In this case the camera has to move while the ArUco marker stays fixed in space, so I had to set up the topics for easy_handeye correctly in a launch file. I placed the ArUco marker on a table facing the camera, put the camera on a tripod, and moved it around, changing both relative orientation and distance, for more than 20 measurements. I made sure the setup was completely static during each measurement and tried to cover the working volume evenly. easy_handeye then gave me the transform between the camera and the spherical-marker coordinates.

I also performed an additional calibration.

I placed additional spherical markers on the ArUco printout, since I needed the transform from the Optitrack spherical-marker coordinates to the center of the ArUco marker. For that case I changed the topics in the launch file, so that the camera is the fixed element and the ArUco marker is the one that moves.

With those two calibrations I could obtain ground-truth measurements for my camera-to-ArUco-marker detection. All the measurements were consistent with what I was measuring, so the calibrations were correct.
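The two setups described above differ only in which frame is treated as fixed and which as moving, which with easy_handeye is expressed through launch-file arguments. A minimal sketch for the first case (camera moving, marker fixed), assuming the standard easy_handeye calibrate.launch interface; every frame name below is a hypothetical placeholder to be replaced with your own tf frame names:

```xml
<launch>
  <include file="$(find easy_handeye)/launch/calibrate.launch">
    <!-- eye-in-hand style: the camera moves with the tracked body -->
    <arg name="eye_on_hand" value="true"/>
    <!-- placeholder frame names; substitute the tf frames of your setup -->
    <arg name="robot_base_frame" value="optitrack_world"/>
    <arg name="robot_effector_frame" value="spherical_markers"/>
    <arg name="tracking_base_frame" value="camera_link"/>
    <arg name="tracking_marker_frame" value="aruco_marker"/>
  </include>
</launch>
```

For the second calibration, the same file would be adapted by swapping which frames play the fixed and moving roles.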


Sinchiguano commented on August 16, 2024

Thank you very much for your detailed answer; I appreciate it. I hope I can make it work, and then I will share the experience.

Have a good one.
Cesar


marcoesposito1988 commented on August 16, 2024

@Sinchiguano, did you manage to solve your issues? Off the top of my head, they may be due to a couple of factors, including:

  • the marker has a different size than what is passed into the aruco configuration (printers love to scale down PDFs, so every time I check with a ruler and have to throw away a couple of printouts)
  • you sample the transforms just after the robot has stopped moving, and the tf frames have not reached the final position yet because of some lag
  • you have outliers in your sampled transforms, for example because in some cases the marker was not detected at all, but the old position was lingering around in tf
  • the tracking is not stable because the marker appears too small in the image, or it is parallel to the image plane of the camera (in this case the detection can be very unstable)
  • the calibration of the camera is totally bananas (uncommon with RGB-D cameras since they typically come with a decent factory calibration that is picked up by the ROS driver, but definitely possible with RGB cameras if you didn't do the calibration yourself)
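One of the failure modes above, a lost marker whose last pose keeps being republished in tf, can be screened for before computing the calibration. A minimal sketch, assuming the sampled marker translations have already been collected into a list of (x, y, z) tuples; the helper name is made up for illustration:

```python
import math

def stale_samples(translations, tol=1e-6):
    """Return indices of samples whose translation is (nearly) identical
    to the previous sample, a hint that the marker was not re-detected
    and an old tf pose was sampled again."""
    stale = []
    for i in range(1, len(translations)):
        # Euclidean distance between consecutive sampled translations
        if math.dist(translations[i - 1], translations[i]) < tol:
            stale.append(i)
    return stale
```

In a real session you would drop (or re-take) the flagged samples before feeding the set to the calibration solver; genuine repeated poses only occur if the robot truly did not move between samples.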


Sinchiguano commented on August 16, 2024

Hey,

Thanks a lot for your detailed explanation. It is really helpful. Everything you detailed makes absolute sense: I can see that I was doing most of the steps wrong, or with too little care. Thanks to your advice, I now understand what I was doing wrong.

I think with your tricks, the calibration is gonna work.

Best regards
Cesar


Sinchiguano commented on August 16, 2024

Hey, I just wanted to thank you for this repository: I finally succeeded with the robot-camera calibration. I did both types of calibration, eye-in-hand and eye-on-base, without any problem, but without your tips I would not have made it.
Best regards
Cesar


wangqingyu985 commented on August 16, 2024

Many thanks to the authors of this excellent repo!
