
ego-planner-v2's People

Contributors

bigsuperzzzx, zhepeiwang


ego-planner-v2's Issues

Sub-optimal paths when planning around obstacles

I noticed that a sub-optimal path is generated when planning around obstacles. Instead of simply going around the obstacle (staying in the same horizontal plane), the planner tends to plan up and down while curving around it. Is there a weight that can be tuned to reduce the extent of this behaviour?

Yaw and yaw rate are the same

Hi @bigsuperZZZX, why does the planner publish identical yaw and yaw-rate values?
yaw: 0.563052045148
yaw_dot: 0.563052045148

In traj_server.cpp, in the function calculate_yaw(), the line
yaw_yawdot.second = yaw_temp;

makes them equal.
Why is this required?

thanks
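One likely reading of the snippet above is that an angle (`yaw_temp`) is being assigned to the rate field. A hedged sketch of how yaw_dot could instead be derived by finite-differencing consecutive yaw samples; the function name and signature here are illustrative, not taken from traj_server.cpp:

```cpp
#include <cassert>
#include <cmath>
#include <utility>

// Hypothetical helper (not from traj_server.cpp): derive yaw_dot by
// finite-differencing consecutive yaw samples. The difference is wrapped
// into (-pi, pi] so that a crossing of the +/-pi seam does not produce a
// huge spurious rate.
std::pair<double, double> yawAndRate(double yaw_prev, double yaw_now, double dt)
{
  double diff = yaw_now - yaw_prev;
  while (diff > M_PI)   diff -= 2.0 * M_PI;  // wrap into (-pi, pi]
  while (diff <= -M_PI) diff += 2.0 * M_PI;
  return {yaw_now, diff / dt};               // {yaw, yaw_dot}
}
```

With this, yaw and yaw_dot are only equal by coincidence rather than by construction.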

The drone_detect package does not seem to work

Hi @bigsuperZZZX,
I am using this project in Gazebo, but when I use the drone_detect package, the new_depth topic does not erase hit pixels from the depth image. My config files are below; I configured the code according to the README.

Two config files:

cam_width: 640
cam_height: 480
cam_fx: 554.254691191187
cam_fy: 554.254691191187
cam_cx: 320.5
cam_cy: 240.5

pixel_ratio: 0.1
estimate/max_pose_error: 0.4
estimate/drone_width: 0.5
estimate/drone_height: 0.2
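As a sanity check on these intrinsics: with a standard pinhole camera model they place the principal point at the centre of a 640x480 image, which is what drone_detect needs when projecting other drones' odometry into the depth image. A small illustrative sketch of that projection (not code from the package):

```cpp
#include <cassert>
#include <cmath>

// Pinhole projection using the intrinsics from the config above.
// (x, y, z) is a point in the camera frame, z pointing forward, in metres.
struct Pixel { double u, v; };

Pixel projectToPixel(double x, double y, double z)
{
  const double fx = 554.254691191187, fy = 554.254691191187;
  const double cx = 320.5, cy = 240.5;  // principal point: centre of 640x480
  return {fx * x / z + cx, fy * y / z + cy};
}
```

If the projected pixels land outside the image, the other drone is out of the camera frustum and no depth pixels would be erased, so mismatched intrinsics are one thing worth ruling out.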

Source file:

DroneDetector::DroneDetector(ros::NodeHandle &nodeHandle)
    : nh_(nodeHandle)
{
  readParameters();

  // depth_img_sub_.reset(new message_filters::Subscriber<sensor_msgs::Image>(nh_, "depth", 50, ros::TransportHints().tcpNoDelay()));
  // colordepth_img_sub_.reset(new message_filters::Subscriber<sensor_msgs::Image>(nh_, "colordepth", 50));
  // camera_pos_sub_.reset(new message_filters::Subscriber<geometry_msgs::PoseStamped>(nh_, "camera_pose", 50));

  my_odom_sub_ = nh_.subscribe("odometry", 100, &DroneDetector::rcvMyOdomCallback, this, ros::TransportHints().tcpNoDelay());
  depth_img_sub_ = nh_.subscribe("depth", 50, &DroneDetector::rcvDepthImgCallback, this, ros::TransportHints().tcpNoDelay());
  // sync_depth_color_img_pose_->registerCallback(boost::bind(&DroneDetector::rcvDepthColorCamPoseCallback, this, _1, _2, _3));

  drone0_odom_sub_ = nh_.subscribe("drone0", 50, &DroneDetector::rcvDrone0OdomCallback, this);
  drone1_odom_sub_ = nh_.subscribe("drone1", 50, &DroneDetector::rcvDrone1OdomCallback, this);
  drone2_odom_sub_ = nh_.subscribe("drone2", 50, &DroneDetector::rcvDrone2OdomCallback, this);
  drone3_odom_sub_ = nh_.subscribe("drone3", 50, &DroneDetector::rcvDrone3OdomCallback, this);
  // droneX_odom_sub_ = nh_.subscribe("others_odom", 50, &DroneDetector::rcvDroneXOdomCallback, this, ros::TransportHints().tcpNoDelay());

  new_depth_img_pub_ = nh_.advertise<sensor_msgs::Image>("new_depth_image", 50);
  debug_depth_img_pub_ = nh_.advertise<sensor_msgs::Image>("debug_depth_image", 50);

  debug_info_pub_ = nh_.advertise<std_msgs::String>("/debug_info", 50);

  cam2body_ << 0.0, 0.0, 1.0, 0.0,
      -1.0, 0.0, 0.0, 0.0,
      0.0, -1.0, 0.0, 0.0,
      0.0, 0.0, 0.0, 1.0;
  // cam2body_ <<
  //     1.0, 0.0, 1.0, -0.25,
  //     0.0, 1.0, 0.0, 0.0,
  //     0.0, 0.0, 1.0, 0.0,
  //     0.0, 0.0, 0.0, 1.0;

  // init drone_pose_err_pub
  for (int i = 0; i < max_drone_num_; i++)
  {
    if (i != my_id_)
      drone_pose_err_pub_[i] = nh_.advertise<geometry_msgs::PoseStamped>("drone" + std::to_string(i) + "to" + std::to_string(my_id_) + "_pose_err", 50);
  }

  ROS_INFO("Successfully launched node.");
}

Launch file:

    <remap from="~odometry"   to="iris_$(arg drone_id)/$(arg odom_topic)"/>
    <remap from="~depth" to="iris_$(arg drone_id)/realsense/depth_camera/depth/image_raw"/>
    <remap from="~new_depth_image" to="iris_$(arg drone_id)/new_depth_image"/>
    <remap from="~debug_depth_image" to="iris_$(arg drone_id)/debug_depth_image"/>
    <!-- <remap from="~colordepth" to="iris_$(arg drone_id)/realsense/depth_camera/color/image_raw"/>
    <remap from="~camera_pose" to="null"/> -->

    <remap from="~drone0" to="/iris_0/$(arg odom_topic)"/>
    <remap from="~drone1" to="/iris_1/$(arg odom_topic)"/>
    <remap from="~drone2" to="/iris_2/$(arg odom_topic)"/>
    <remap from="~drone3" to="/iris_3/$(arg odom_topic)"/>

Occupancy grid not updating

Hi @bigsuperZZZX,
I have one question. I tested ego-planner-v2 with point-cloud data in the real world. After running the planner for some time, the occupancy grid stops updating: it reports that the drone is inside an obstacle (as can be visualized in Rviz), but in the real world there are no obstacles. If I restart the planner, it works and the grid updates again. Below I've attached two pictures. Can you please tell me the probable cause of this problem?
Screenshot from 2022-07-18 16-37-29
Screenshot from 2022-07-18 18-10-15

FMU software

I am excited about the release of the hardware #10, as it provides us with fascinating insight into this repository. I am interested in learning more details about the FMU. Can you please provide the following information?

1. Is the FMU driven by PX4 firmware or a custom one?
2. If PX4 firmware is used, which version should I install?
3. If custom firmware is used, how can I use PX4 Autopilot to control the FMU?

Hardware list

It would be great if the developers included the BOM (bill of materials) for this small drone, so users can have a comprehensive experience with this repo. Is it possible? Anyway, great job!

Sorry for a meaningless pull request

I want to express my sincere apologies for the low-quality pull request that I submitted to your project. Since I am not very familiar with git commands, I submitted it by mistake. I understand that it was not up to the standards of the project, and I regret any inconvenience or confusion it may have caused.

I have since closed the pull request. Thank you for your understanding and for your efforts in maintaining this project.

Yaw steps violently at the beginning of planning

Hello! I am using EGO-Planner v2 to plan a smooth path in an AirSim simulator. When the UAV is stopped and I give a target opposite to the direction of the UAV's heading, the planner produces a path in which the position and velocity are smooth, but the yaw has a violent step: it flips from front to back in the first pos_cmd frame. In Rviz this is masked because the pos_2_odom node simply sets the odometry, ignoring dynamics. However, because of the limited dynamics of my UAV simulator, my UAV goes out of control.

The problem can easily be reproduced with your demo launch single_drone_interactive.launch.

Besides, according to my tests, your previous works EGO-Planner v1 and EGO-Planner Swarm don't have this problem.

So I want to ask whether the problem can be solved by changing some code?
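One possible workaround, until the root cause is addressed: rate-limit the yaw command in your own command layer so that no single pos_cmd frame can flip the heading instantly. A hedged sketch (the names and limits are illustrative, not from the planner):

```cpp
#include <cassert>
#include <cmath>

// Hypothetical yaw slew limiter: move from yaw_prev toward yaw_cmd along the
// shortest angular direction, but advance at most max_rate * dt per frame.
double limitYawStep(double yaw_cmd, double yaw_prev, double max_rate, double dt)
{
  double diff = yaw_cmd - yaw_prev;
  while (diff > M_PI)   diff -= 2.0 * M_PI;  // shortest angular distance
  while (diff <= -M_PI) diff += 2.0 * M_PI;
  const double max_step = max_rate * dt;
  if (diff >  max_step) diff =  max_step;    // clamp to the dynamic limit
  if (diff < -max_step) diff = -max_step;
  return yaw_prev + diff;
}
```

Applied between the planner's pos_cmd output and the simulator's controller, this turns a pi-sized step into a ramp the vehicle dynamics can follow.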

Plan global traj through multiple WPs

Hi, thanks for the amazing work. I was wondering if the EGOPlannerManager::planGlobalTrajWaypoints method in planner_manager.cpp can be used for multi-WP continuous global trajectory generation. By default the FSM logic seems to visit each WP one-by-one. I wanted your insight on how to go about adding the multi-WP continuous global traj generation. The above-mentioned function already has the core planning logic, but I wanted to get some clarity on how to integrate this with the FSM.

Localization and drift correction

Hi, your work is amazing! Could you please confirm whether the localization and drift-correction components are included in the simulation code? Additionally, could you provide details on the drift-correction section to improve the accuracy of the drones? Thank you.

360-degree point cloud not showing up

Hi @bigsuperZZZX, I am passing 360-degree lidar data to the planner, but I am able to visualize only the objects directly in front of the copter. I've attached an image below that illustrates this better.

The solid red colour is the original point cloud I'm passing to the planner.

thanks
Screenshot from 2022-05-24 10-05-44

Why is a multi-threaded spinner not used?

Hi @bigsuperZZZX,
I am using the planner with point cloud data and I get this error again and again:
[traj_server] Lost heartbeat from the planner, is it dead?

To get around it, I used a multi-threaded spinner, which led to the following error:
A negtive value of nearby obstacle number! reset the map.

Can you point out possible solutions to the heartbeat problem?

Planner unable to create a plan in the case of wide obstacles

Hi,
I was trying to run the planner in an environment with wide obstacles.
I have set <arg name="virtual_ceil" value="4.5" />

Here's how the obstacle actually looks:
Screenshot from 2023-02-17 16-45-54

When I try to plan a path with a goal on the other side of the obstacle, the planner is unable to make a plan, and I constantly get these warnings:
[ WARN] [1676632376.780805656, 1023.498000000]: Failed in A star path searching !!! 0.2 seconds time limit exceeded.
[ WARN] [1676632376.780850817, 1023.498000000]: A-star error, force return!

Do I need to change something to avoid such obstacles?

This is the grid-map view of the obstacle:
blockingg

thanks

Ring buffer update

Hi @bigsuperZZZX, thank you so much for this update.
I have a few questions. I went through the code base and found that map_size_x, map_size_y, and map_size_z appear in the launch files, but they are used only in gridmap_bigmap.cpp, not in grid_map.cpp. Can you please explain which one is used when?

The second question: I see 4 workspaces, along with a visual representation. Can you please share a brief description of how they fundamentally differ from each other?

Thanks again and kudos for amazing work

Planner not planning the path and not stopping the drone at the end of the trajectory

Hi @bigsuperZZZX,
I hope you are doing well. Recently I was testing the ego-planner-v2 implementation and came across a situation where the planner planned a path and the drone was following it, but the planner didn't send a zero-velocity command once the drone reached the waypoint. Below I have attached a video demonstrating the issue and the corresponding logs. I have not modified the planner code.
The crash happened at the end, so the end of the log file is the most helpful part.

thanks

03-10-22-test3-2022-10-03-074557_o7jBL1P2.online-video-cutter.com.mp4

03_10_22_test3_planner (1).txt

Drone movement is too aggressive

Hello,

I am implementing drone avoidance using EGO-Planner v2 with an ArduPilot and MAVROS controller.

Everything is excellent! I admire your research.

However, when an obstacle disappears from the drone's field of view, the planner changes the drone's yaw direction too abruptly. It also twists too much on narrow roads.

Can I get a smoother path if I fix something?

Please refer to my video.

https://youtu.be/mMYVuqe9s_o

https://youtu.be/c3ECgOdWexE

[Missing Drone] No drones seen in rviz

Hello! Thank you very much for your excellent work!

I followed the tutorial in the video step by step, and everything (planner, trajectory, swarm, ...) works well except that the drone itself is missing. I checked all the *.launch files in the four subfolders, and the drones are all invisible,

just as shown below:
missing_drone

I followed the tutorial in another of your works, ego-planner-swarm, as well, and there I can see the drone:
normal_drone

I noticed that toggling on drone0/Simulation/robot makes the drone visible, but it is already on in the former launch.

I'm new to ROS, as well as rviz, launch files, etc., and I'm not sure which log or message I should provide for troubleshooting. I'm looking forward to your reply, thank you!

How is a 3D point converted to a 1D representation in the buffer?

Hi @bigsuperZZZX @USTfgaoaa,
My understanding of functions such as GridMap::globalIdx2BufIdx and GridMap::globalIdx2InfBufIdx is that they convert a 3D voxel index into a 1D index for a flat buffer; correct me if I am wrong.
I want to understand the working of the function below:

inline int GridMap::globalIdx2BufIdx(const Eigen::Vector3i &id)
{
  int x_buffer = (id(0) - md_.ringbuffer_origin3i_(0)) % md_.ringbuffer_size3i_(0);
  int y_buffer = (id(1) - md_.ringbuffer_origin3i_(1)) % md_.ringbuffer_size3i_(1);
  int z_buffer = (id(2) - md_.ringbuffer_origin3i_(2)) % md_.ringbuffer_size3i_(2);
  if (x_buffer < 0)
    x_buffer += md_.ringbuffer_size3i_(0);
  if (y_buffer < 0)
    y_buffer += md_.ringbuffer_size3i_(1);
  if (z_buffer < 0)
    z_buffer += md_.ringbuffer_size3i_(2);

  return md_.ringbuffer_size3i_(0) * md_.ringbuffer_size3i_(1) * z_buffer + md_.ringbuffer_size3i_(0) * y_buffer + x_buffer;
}

What is the intuition behind this 3D-to-1D conversion?
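As far as I can tell from the snippet, each axis is wrapped modulo the ring-buffer size (with a fix-up because C++ `%` can return a negative value when the dividend is negative), and the three wrapped coordinates are then flattened row-major, `buf = sx*sy*z + sx*y + x`, exactly like indexing a dense 3D array. A standalone sketch of the same mapping, with made-up size and origin values:

```cpp
#include <array>
#include <cassert>

// Standalone illustration of the ring-buffer index mapping used in
// GridMap::globalIdx2BufIdx: wrap each axis modulo the buffer size, then
// flatten row-major. Origin and size here are example values, not the
// planner's actual parameters.
int globalIdx2BufIdx(const std::array<int, 3> &id)
{
  const std::array<int, 3> origin = {-5, -5, 0};  // example ring-buffer origin
  const std::array<int, 3> size   = {10, 10, 4};  // example ring-buffer size
  int buf[3];
  for (int k = 0; k < 3; ++k)
  {
    buf[k] = (id[k] - origin[k]) % size[k];
    if (buf[k] < 0)
      buf[k] += size[k];  // C++ '%' keeps the sign of the dividend
  }
  return size[0] * size[1] * buf[2] + size[0] * buf[1] + buf[0];
}
```

The point of the modulo is that when the drone moves and the map origin shifts, a global voxel index maps back into the same fixed-size buffer: voxels that leave the window on one side are overwritten by voxels entering on the other, so no memory ever has to be moved.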

A few questions related to inter-drone communication

Hi @bigsuperZZZX,
I went through the paper and the attached supplementary material, and the documentation does an excellent job of explaining everything in a lot of detail.
I have a few questions related to inter-drone communication:

  1. In the experiments you performed, were all drones connected to the same ROS master?
  2. Is there an explicit need for time synchronization between all the drones? If yes, how did you achieve it?
  3. Is an offset or delay in the received trajectories handled?

thanks

2D planning?

Hi, thank you for sharing this great work. I am wondering if there is a way to limit the planner to planning trajectories in a 2D plane? Thanks.
