
Comments (9)

Rak-r commented on September 2, 2024

Update:

I just turned everything off and tested with only RTAB-Map's odometry (same params as above) and RTAB-Map's SLAM: no EKF and no odometry from my sensor.

I again did that sharp turn, and this time the robot behaves as I described above (it turns instead of sliding sideways, which it was doing previously). This means that my sensor's odometry and the steering angle are being fused wrongly in the EKF. @matlabbe, could you provide some insight based on the params above? I can say that the odometry from my sensor is good for translation; I just need to handle the case where RTAB-Map's VO is lost, and for that I fuse only the steering angle in the EKF and configure the EKF not to trust the steering much.

Could this be related to how I am fusing odometry data from the sensor mounted on base_link, for which robot_state_publisher publishes the static transform (as it does for the camera)?


Rak-r commented on September 2, 2024

I tried fusing only my sensor's translation (pose.x and pose.y) together with the vyaw from RTAB-Map's rgbd_odometry. I fed this setup to the EKF and ran the same test as above. At the start it seems OK-ish; the pose.x values from my sensor and from the RGB-D odometry differ by 0.4-0.5 m when I drive straight.

But as soon as I try to make the U-turn, the robot's heading is correct this time, but there is a huge difference between the EKF's estimated pose and the RGB-D odometry: the EKF reports values around 2.4-2.5 while the RGB-D odometry reports 0.24-0.26 for pose.x.

I believe I am getting confused with the setup and the EKF is making things weird.

Also, in rgbd_odometry the parameter odom_frame_id defaulted to odom; I removed it and left it blank, but nothing changed. I need advice on this weird behaviour. @matlabbe


Rak-r commented on September 2, 2024

Update:

I played with my sensor's output and found that its y values do not follow REP 103, even after negating them. For now, I pass only pose.x from my sensor, together with absolute pose.x, pose.y, and orientation.yaw from the RGB-D odometry, into the EKF. I performed the same U-turn and the side drift is gone; a bit more tweaking is needed so that my sensor data and the VO report nearly the same pose.x.

If somebody is struggling with the same EKF fusion setup, just follow the steps below (a launch sketch follows the list):

  1. First, run RTAB-Map's SLAM with just RTAB-Map's RGB-D odometry and verify that it behaves correctly indoors.

  2. When fusing data from other sensors with the RGB-D odometry, turn off the transform publisher in rgbd_odometry: set publish_tf: false so it no longer publishes odom -> base_link.

  3. Make sure your sensor / wheel odometry node is not publishing the odom -> base_link transform either.

  4. Add the odometry topics (or whatever your nodes output) to the EKF; fuse absolute poses from the source you are more confident in and velocities from the less stable one. Example: VO gets lost against white walls, and at that moment the other source needs to handle the gap.

  5. Then, in the rgbd_odometry launch file, set publish_null_when_lost: false and 'Odom/ResetCountdown': '1' so odometry resets immediately and the other source covers the measurement.

  6. Run rviz/rviz2 with the global frame set to odom and verify that it behaves similarly to the RGB-D-odometry-only case; a small difference is fine.

  7. Now, in RTAB-Map's SLAM launch file, remap the odom topic to the EKF's output topic (usually odometry/filtered) and start the SLAM with the global frame changed to map. Test it your own way, or as described earlier in this issue.

  8. You might need to tweak the EKF's process noise covariances.
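
For reference, here is a minimal ROS 2 launch sketch of the EKF side of steps 2-8, assuming robot_localization's ekf_node. The topic names (/flow_odom, /vo) and the choice of fused fields are placeholders to adapt to your own setup (here: forward velocity and yaw rate from the sensor odometry, absolute x, y, yaw from the RGB-D odometry):

from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='robot_localization',
            executable='ekf_node',
            name='ekf_filter_node',
            parameters=[{
                'frequency': 30.0,
                'two_d_mode': True,              # planar robot
                'odom_frame': 'odom',
                'base_link_frame': 'base_link',
                'world_frame': 'odom',           # EKF publishes odom -> base_link (steps 2-3)
                # Sensor / wheel odometry (placeholder topic): fuse vx and vyaw only.
                'odom0': '/flow_odom',
                'odom0_config': [False, False, False,   # x, y, z
                                 False, False, False,   # roll, pitch, yaw
                                 True,  False, False,   # vx, vy, vz
                                 False, False, True,    # vroll, vpitch, vyaw
                                 False, False, False],  # ax, ay, az
                # RGB-D odometry (placeholder topic): fuse absolute x, y, yaw.
                'odom1': '/vo',
                'odom1_config': [True,  True,  False,   # x, y
                                 False, False, True,    # yaw
                                 False, False, False,
                                 False, False, False,
                                 False, False, False],
            }],
        ),
    ])

The EKF output is published on odometry/filtered, which is the topic to remap into RTAB-Map's SLAM node in step 7.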

The other way is to use RGB-D odometry alone with the guess_frame_id parameter, which handles the lost situation in a similar way (see the sketch after the list):

  1. Set odom_frame_id to vo (so the RGB-D odometry plays the role the EKF played above), set guess_frame_id to odom, and keep publishing odom -> base_link from your sensor or wheel odometry.

  2. This does not break the TF tree; the new setup looks like vo -> odom -> base_link.

  3. When RTAB-Map's SLAM is launched, the TF tree becomes map -> vo -> odom -> base_link.
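
A rough sketch of that alternative, assuming a recent ROS 2 release of rtabmap_ros (where the odometry nodes live in the rtabmap_odom package; older releases put rgbd_odometry in rtabmap_ros itself), with placeholder camera topic names:

from launch import LaunchDescription
from launch_ros.actions import Node

def generate_launch_description():
    return LaunchDescription([
        Node(
            package='rtabmap_odom',          # 'rtabmap_ros' on older releases
            executable='rgbd_odometry',
            parameters=[{
                'frame_id': 'base_link',     # robot base frame
                'odom_frame_id': 'vo',       # VO publishes vo -> odom ...
                'guess_frame_id': 'odom',    # ... using odom -> base_link as the guess
                'publish_tf': True,          # tf tree becomes vo -> odom -> base_link
            }],
            remappings=[
                ('rgb/image', '/camera/color/image_raw'),           # placeholder topics
                ('rgb/camera_info', '/camera/color/camera_info'),
                ('depth/image', '/camera/aligned_depth/image_raw'),
            ],
        ),
    ])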


matlabbe commented on September 2, 2024

I think the optical flow odometry and the steering angle are wrongly set in their respective odometry topics. Are they really giving pose.x and pose.y globally, or relative to the base frame? You may want to set the delta value as a velocity in the base frame instead of a pose. Same thing for the steering angle: it is independent of the global yaw, so you would have to convert it to a yaw velocity instead of a yaw orientation.
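
For illustration, a minimal sketch of that suggestion, assuming per-sample displacements dx, dy from the optical flow in the base frame and a bicycle steering model (the function and variable names are made up for the example):

import math
from nav_msgs.msg import Odometry

def make_twist_odom(dx, dy, steering_angle, wheelbase, dt, stamp):
    # Report body-frame velocities in the twist instead of accumulating a pose.
    odom = Odometry()
    odom.header.stamp = stamp
    odom.header.frame_id = 'odom'         # pose frame (pose left unfilled here)
    odom.child_frame_id = 'base_link'     # twist is expressed in this frame
    vx = dx / dt                          # forward velocity from the flow deltas
    odom.twist.twist.linear.x = vx
    odom.twist.twist.linear.y = dy / dt
    # Steering angle is relative to the base frame, so convert it to a yaw rate.
    odom.twist.twist.angular.z = vx * math.tan(steering_angle) / wheelbase
    return odom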


Rak-r commented on September 2, 2024

The optical flow sensor is static relative to base_link, and robot_state_publisher publishes the static transform between the optical flow frame and base_link (base_link as the parent frame).

In the odometry message I generate, I set the frame_id to odom and the child_frame_id to base_link.

I also feed the steering angle into the yaw field of the same odometry message. This message is then fed to the EKF to handle some uncertainty. @matlabbe

I tried feeding just pose.x from this odometry message plus the absolute pose from RGB-D odometry into the EKF. Is that wrong? I also tested with velocity in x from this sensor odometry and the absolute pose from RGB-D; they have different values, but the side drifting is gone and the values now increase/decrease in the same direction.


Rak-r commented on September 2, 2024

What do you mean by pose.x and pose.y globally? The message is in the odom frame, and RTAB-Map's SLAM does the correction in the map frame over the topic /localization_pose.

I also checked my sensor's y values: while turning right it sometimes reports positive values, which is wrong per REP 105. So I turned off y from my sensor and use the RGB-D odometry's absolute pose info. @matlabbe


matlabbe commented on September 2, 2024

By globally, I meant in reference to the same fixed frame (e.g., odom). Independently of the robot's orientation, the x-y values refer to that same fixed frame. For example, if the robot moves forward 1 meter (x=1, y=0), turns 90 deg clockwise, and moves another 1 meter forward, the resulting pose should be x=1, y=-1, not x=2, y=0. If your optical flow sensor gives x-y delta values from the previous pose, convert them to a velocity and fuse the velocity instead. For the steering angle, if it says "+30 deg", that is relative to the base frame, not the odom frame, so it cannot be used as-is in the pose of an odometry message. It can be converted to a velocity, put in the twist of the odometry message, and fused as vyaw.
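
A small sketch of the difference, with made-up names: base-frame deltas have to be rotated by the current global yaw before they are accumulated in the odom frame.

import math

abs_x, abs_y, yaw = 0.0, 0.0, 0.0

def integrate(dx, dy, dyaw):
    # dx, dy are displacements in the base frame; yaw is the global heading.
    global abs_x, abs_y, yaw
    abs_x += dx * math.cos(yaw) - dy * math.sin(yaw)
    abs_y += dx * math.sin(yaw) + dy * math.cos(yaw)
    yaw += dyaw

# Forward 1 m, turn 90 deg clockwise, forward 1 m:
integrate(1.0, 0.0, 0.0)            # x=1, y=0
integrate(0.0, 0.0, -math.pi / 2)   # heading is now -90 deg
integrate(1.0, 0.0, 0.0)            # x=1, y=-1, not x=2, y=0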

In the definition of the odometry message, you will see that the pose is in frame_id and the twist is in child_frame_id.


Rak-r commented on September 2, 2024

OK. What I am doing is taking those dx and dy and feeding the absolute pose into odometry pose.x and pose.y with abs_x += dx and abs_y += dy; I also calculate the velocities as vx = dx/dt and vy = dy/dt.

For the steering angle, I compute it from the feedback messages of the linear actuator and then use:

vyaw = vx * tan(current_angle) / wheelbase
delta_theta = vyaw * delta_time
abs_theta += delta_theta

Then I generate the odometry message:

odom_msg.header.stamp = self.get_clock().now().to_msg()
odom_msg.header.frame_id = 'odom'        # pose is expressed in this frame
odom_msg.child_frame_id = 'base_link'    # twist is expressed in this frame
odom_msg.pose.pose.position.x = abs_x
odom_msg.pose.pose.position.y = abs_y
odom_msg.twist.twist.linear.x = vx
odom_msg.twist.twist.angular.z = vyaw

The vx used to compute the yaw rate is the same one the optical flow reports, per the calculation above and what you just mentioned. I am getting confused with the frame settings. Do I need to set the optical flow frame as the child frame in the odometry message, with odom as the parent frame, so that the absolute pose is represented in the odom frame with respect to the optical flow frame, while the twist is represented in the optical flow frame instead of base_link? Correct me if I am getting something wrong or missing something.
@matlabbe


matlabbe commented on September 2, 2024

It should be the base_link frame (ideally the same frame rgbd_odometry is using).

I would assume vx is in the base frame, so you cannot just do abs_x += dx, because abs_x should be in the odom frame: dx should be transformed into the global orientation of the odom frame before being summed. For robot_localization, you cannot fuse the poses of two odometry messages (see http://docs.ros.org/en/melodic/api/robot_localization/html/preparing_sensor_data.html#odometry); you should fuse them as differential, which implicitly converts the poses to velocities and fuses those velocities in the base frame. Alternatively, you can fuse the twist of your wheel odometry directly. You may also have to set differential for the RGB-D odometry in case it gets lost, or use only its twist.
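
As a sketch, the relevant robot_localization fragment might look like this (topic names are placeholders): fuse the wheel odometry's twist directly, and fuse the VO pose differentially so that pose jumps after a reset are absorbed.

ekf_params = {
    # Wheel / optical flow odometry: fuse its twist directly.
    'odom0': '/wheel_odom',
    'odom0_config': [False, False, False,
                     False, False, False,
                     True,  False, False,   # vx
                     False, False, True,    # vyaw
                     False, False, False],
    # RGB-D odometry: fuse its pose differentially (converted to velocity internally).
    'odom1': '/vo',
    'odom1_config': [True,  True,  False,   # x, y
                     False, False, True,    # yaw
                     False, False, False,
                     False, False, False,
                     False, False, False],
    'odom1_differential': True,
}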

