
kr_autonomous_flight's Introduction


This is the autonomous flight code stack used at KumarRobotics, providing a complete solution for GPS-denied quadcopter autonomy. It has been tested extensively in challenging urban and rural (under forest canopy) environments.


High-level Code Structure

[Diagram of the high-level code structure]

Documentation and Instructions

Please refer to our Wiki page for detailed instructions about how to use this code.

Videos

Real-world experiments in large-scale autonomous flight with real-time semantic SLAM in forests

Real-world experiments in fast, autonomous, GPS-denied quadrotor flight

Simulation experiments in fast, autonomous flight in urban and rural environments

Contributing

Report issues: Open an issue on GitHub.

Merge code changes: Open a pull request on GitHub.

Citation

If you use our code in your work, please cite:


@article{mohta2018fast,
  title={Fast, autonomous flight in GPS-denied and cluttered environments},
  author={Mohta, Kartik and Watterson, Michael and Mulgaonkar, Yash and Liu, Sikang and Qu, Chao and Makineni, Anurag and Saulnier, Kelsey and Sun, Ke and Zhu, Alex and Delmerico, Jeffrey and Thakur, Dinesh and Karydis, Konstantinos and Atanasov, Nikolay and Loianno, Giuseppe and Scaramuzza, Davide and Daniilidis, Kostas and Taylor, Camillo Jose and Kumar, Vijay},
  journal={Journal of Field Robotics},
  volume={35},
  number={1},
  pages={101--120},
  year={2018}
}
@inproceedings{mohta2018experiments,
  title={Experiments in fast, autonomous, gps-denied quadrotor flight},
  author={Mohta, Kartik and Sun, Ke and Liu, Sikang and Watterson, Michael and Pfrommer, Bernd and Svacha, James and Mulgaonkar, Yash and Taylor, Camillo Jose and Kumar, Vijay},
  booktitle={2018 IEEE International Conference on Robotics and Automation (ICRA)},
  pages={7832--7839},
  year={2018},
  organization={IEEE}
}
@article{liu2022large,
  title={Large-Scale Autonomous Flight With Real-Time Semantic SLAM Under Dense Forest Canopy},
  author={Liu, Xu and Nardari, Guilherme V. and Ojeda, Fernando Cladera and Tao, Yuezhan and Zhou, Alex and Donnelly, Thomas and Qu, Chao and Chen, Steven W. and Romero, Roseli A. F. and Taylor, Camillo J. and Kumar, Vijay},
  journal={IEEE Robotics and Automation Letters},
  year={2022},
  volume={7},
  number={2},
  pages={5512--5519},
}

Acknowledgement

This is a multi-year project. We gratefully acknowledge contributions from our current members and lab alumni. The contributors can be found in the papers listed above. In addition, Laura Jarin-Lipschitz and Justin Thomas also contributed to this work. We also thank our funding agencies.

License

This code is released under the Penn Software License. Please refer to LICENSE.txt for details.

kr_autonomous_flight's People

Contributors

ankitvp77, falconkumarlab, fcladera, gnardari, kartikmohta, ljarin, orlando21lara, shaoyifei96, spencerfolk, tyuezhan, versatran01, xurobotics


kr_autonomous_flight's Issues

Missing header file

While compiling the project I get errors about missing header files: Path.h, SplineTrajectory.h, and VoxelMap.h. I cannot find any of these header files.

Python linting

It would be nice to have Python linting for our code using reviewdog.

Fix A star error in JPS3D/graph_search.cpp

Current status: printf("ASTAR ERROR!\n"); is called in graph_search.cpp. This does not seem to affect the performance of the stack.

Ideal scenario: we would like to understand why this is happening and fix it. Removing the printf is not acceptable.

Imu quality

What is the minimum IMU quality or IMU type required for real-time control of the robot using VIO? VN IMUs are very costly; can it work with the IMU used in a smartphone?
Thank you

[Build Error] Could not find a package configuration file provided by "TBB" with any of the following names: TBBConfig.cmake, tbb-config.cmake

System Configuration
OS: Ubuntu 18.04
ROS Distro: Melodic

Errors     << fake_lidar:cmake /home/$(USER)/kr_ws/logs/fake_lidar/build.cmake.003.log              
CMake Error at /home/$(USER)/kr_ws/src/kr_autonomous_flight/autonomy_sim/unity_sim/fake_lidar/CMakeLists.txt:6 (list):
  list does not recognize sub-command PREPEND


CMake Error at /home/$(USER)/kr_ws/src/kr_autonomous_flight/autonomy_sim/unity_sim/fake_lidar/CMakeLists.txt:21 (find_package):
  By not providing "FindTBB.cmake" in CMAKE_MODULE_PATH this project has
  asked CMake to find a package configuration file provided by "TBB", but
  CMake did not find one.

  Could not find a package configuration file provided by "TBB" with any of
  the following names:

    TBBConfig.cmake
    tbb-config.cmake

  Add the installation prefix of "TBB" to CMAKE_PREFIX_PATH or set "TBB_DIR"
  to a directory containing one of the above files.  If "TBB" provides a
  separate development package or SDK, be sure it has been installed.


cd /home/$(USER)/kr_ws/build/fake_lidar; catkin build --get-env fake_lidar | catkin env -si  /usr/bin/cmake /home/$(USER)/kr_ws/src/kr_autonomous_flight/autonomy_sim/unity_sim/fake_lidar --no-warn-unused-cli -DCATKIN_DEVEL_PREFIX=/home/$(USER)/kr_ws/devel/.private/fake_lidar -DCMAKE_INSTALL_PREFIX=/home/$(USER)/kr_ws/install -DCMAKE_BUILD_TYPE=Release; cd -

Write mapper as a ROS service

Currently, the mapper is a node that keeps publishing global and local maps regardless of whether any node is using them. This results in unnecessary message publication and computation (e.g. the local map is published at 10 Hz but the local planner only uses it at 2 Hz).

A more efficient design is to expose the mapper as a ROS service: whenever the planner needs the map, its mapper client sends a request and obtains the map at that timestamp.
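
As a rough illustration, such a service could look like the sketch below. The GetVoxelMap service type and the cropLocalMap/getGlobalMap helpers are assumptions for illustration, not the current mapper API:

#include <ros/ros.h>
#include <kr_planning_msgs/VoxelMap.h>
#include <mapper/GetVoxelMap.h>  // hypothetical srv: request has `bool local`, response has `VoxelMap map`

kr_planning_msgs::VoxelMap cropLocalMap();   // assumed helper
kr_planning_msgs::VoxelMap getGlobalMap();   // assumed helper

bool handleGetVoxelMap(mapper::GetVoxelMap::Request& req,
                       mapper::GetVoxelMap::Response& res) {
  // Build the map only when a planner actually asks for it,
  // instead of publishing at a fixed rate.
  res.map = req.local ? cropLocalMap() : getGlobalMap();
  return true;
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "mapper");
  ros::NodeHandle nh;
  ros::ServiceServer srv = nh.advertiseService("get_voxel_map", handleGetVoxelMap);
  ros::spin();
  return 0;
}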

Differentiate 2D and 3D planner in local replan server

Current status: When we initialize a planner instance, we always initialize it in 3D and we are setting the control inputs to zero in the z-direction. We did this for visualization purposes (because we need a 3D voxel map), but this is not optimal in terms of compute.

Ideal scenario: we would like the planner to be initialized in 2D. We may need to modify the current planner-util package to be able to use a slice of the 3D map (while still being able to visualize it in rviz).
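
As an illustration of the slicing idea, a single z-layer can be cropped out of a column-major 3D grid; the names and layout below are illustrative:

#include <cassert>
#include <cstdint>
#include <vector>

// Extract the voxel layer at height index z from a column-major 3D grid
// (idx = i + dim0*j + dim0*dim1*k, matching the mapper's indexing).
std::vector<int8_t> sliceLayer(const std::vector<int8_t>& data,
                               int dim0, int dim1, int dim2, int z) {
  assert(z >= 0 && z < dim2);
  std::vector<int8_t> layer(static_cast<size_t>(dim0) * dim1);
  for (int j = 0; j < dim1; ++j)
    for (int i = 0; i < dim0; ++i)
      layer[i + dim0 * j] = data[i + dim0 * j + dim0 * dim1 * z];
  return layer;
}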

Move global and storage maps along with robot

Reinitialize the global and storage maps to be centered around the robot after it moves a certain distance, so that the robot can handle arbitrarily large environments.

Currently, the global map and storage map have fixed origins and dimensions. The local map is cropped out of the storage map according to the robot center.

One easy way to achieve this is to clear all voxels of the global and storage maps and reinitialize them to be centered around the robot whenever the robot is close to the boundaries of the global / storage map (while doing so, publish a state trigger for the stopping policy to guarantee safety).

FYI, the storage map only exists to speed up the generation of the local map. A storage map is a map that is large and has the same resolution as the local map. Given a local map center and dimensions, we just crop a submap from the storage map. The global map has similar dimensions to the storage map but is much more coarsely discretized. The idea is that the global planner plans in the global map and the local planner plans in the small local map, thus increasing computational efficiency while not compromising much on optimality (the local planner can still deviate from the global path and fly through small gaps if they exist).
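
A minimal sketch of the recentering check described above, assuming hypothetical hooks for the stopping policy and voxel clearing:

#include <Eigen/Core>

// Hypothetical recentering check: when the robot gets close to a map boundary,
// stop (for safety), then clear and recenter the storage/global maps on the robot.
void maybeRecenter(const Eigen::Vector3d& robot_pos, Eigen::Vector3d& map_origin,
                   const Eigen::Vector3d& map_dim_m, double margin_m) {
  const Eigen::Vector3d lo = map_origin + Eigen::Vector3d::Constant(margin_m);
  const Eigen::Vector3d hi = map_origin + map_dim_m - Eigen::Vector3d::Constant(margin_m);
  if ((robot_pos.array() < lo.array()).any() || (robot_pos.array() > hi.array()).any()) {
    // triggerStoppingPolicy();  // assumed hook: publish the state trigger first
    // clearAllVoxels();         // assumed hook: drop the old contents
    map_origin = robot_pos - 0.5 * map_dim_m;  // recenter on the robot
  }
}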

Make the voxel map's default values for occupied, unknown, and free voxels parameters in VoxelMap.msg

Currently, the default values for occupied, even, unknown, and free voxels are hard-coded, respectively, as val_occ, val_even, val_unknown, and val_free. The planner and mapper have a consensus on the hard-coded values. This may cause problems, e.g., when someone wants to use different planners.

A better way is to add parameters to VoxelMap.msg describing the default values that encode the occupancy status.

(val_even is a value between val_free and val_unknown; when the mapper decay is turned on, it is used as a threshold for determining the occupancy status.)
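
A sketch of what this could look like, with the proposed message fields shown as comments and a hypothetical consumer; all field names are assumptions:

// Proposed additions to VoxelMap.msg (field names are illustrative):
//   int8 val_free
//   int8 val_even
//   int8 val_unknown
//   int8 val_occ
#include <kr_planning_msgs/VoxelMap.h>

// With the values carried in the message, a planner no longer needs
// hard-coded constants that must match the mapper's.
bool isOccupied(const kr_planning_msgs::VoxelMap& map, size_t idx) {
  return map.data[idx] >= map.val_occ;  // val_occ is the proposed new field
}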

protobuf issue - Errors << ouster_gazebo_plugins:check /home/minasm/catkin_ws/logs/ouster_gazebo_plugins/build.check.010.log

Hi,

How can I fix this build issue?

protoc --version
libprotoc 3.21.12

catkin build throws the following error:

Errors << ouster_gazebo_plugins:check /home/minasm/catkin_ws/logs/ouster_gazebo_plugins/build.check.010.log
CMake Warning (dev) at CMakeLists.txt:2 (project):
Policy CMP0048 is not set: project() command manages VERSION variables.
Run "cmake --help-policy CMP0048" for policy details. Use the cmake_policy
command to set the policy and suppress this warning.

The following variable(s) would be set to empty:

CMAKE_PROJECT_VERSION
CMAKE_PROJECT_VERSION_MAJOR
CMAKE_PROJECT_VERSION_MINOR
CMAKE_PROJECT_VERSION_PATCH

This warning is for project developers. Use -Wno-dev to suppress it.

catkin build

Profile: default
Extending: [cached] /opt/ros/noetic
Workspace: /home/minasm/catkin_ws

Build Space: [exists] /home/minasm/catkin_ws/build
Devel Space: [exists] /home/minasm/catkin_ws/devel
Install Space: [unused] /home/minasm/catkin_ws/install
Log Space: [exists] /home/minasm/catkin_ws/logs
Source Space: [exists] /home/minasm/catkin_ws/src
DESTDIR: [unused] None

Devel Space Layout: linked
Install Space Layout: None

Additional CMake Args: -DCMAKE_BUILD_TYPE=Release
Additional Make Args: None
Additional catkin Make Args: None
Internal Make Job Server: True
Cache Job Environments: False

Buildlisted Packages: None
Skiplisted Packages: None

Workspace configuration appears valid.

[build] Found 50 packages in 0.0 seconds.
[build] Package table is up to date.
Starting >>> client_launch
Starting >>> control_launch
Starting >>> estimation_launch
Starting >>> fla_ukf
Starting >>> gazebo_utils
Starting >>> kr_mav_msgs
Starting >>> kr_mesh_visualization
Starting >>> kr_planning_msgs
Starting >>> kr_tracker_msgs
Starting >>> map_plan_launch
Starting >>> mrsl_models
Starting >>> mrsl_quadrotor_description
Starting >>> mrsl_quadrotor_utils
Starting >>> msckf_vio
Starting >>> ouster_client
Starting >>> ouster_description
Starting >>> ouster_gazebo_plugins
Starting >>> px4_interface_launch
Starting >>> real_experiment_launch
Starting >>> state_machine_launch
Starting >>> waypoint_navigation_plugin
Finished <<< control_launch [ 0.1 seconds ]
Finished <<< ouster_description [ 0.1 seconds ]
Finished <<< estimation_launch [ 0.1 seconds ]
Finished <<< real_experiment_launch [ 0.1 seconds ]
Finished <<< fla_ukf [ 0.2 seconds ]
Finished <<< state_machine_launch [ 0.1 seconds ]
Finished <<< kr_tracker_msgs [ 0.3 seconds ]
Finished <<< map_plan_launch [ 0.1 seconds ]

Reliable integration of LIDAR odometry into autonomy stack

We have tried integrating LIDAR odometry into the UKF using LLOL. We successfully flew waypoint missions using LIDAR only for both odometry and mapping (i.e. no cameras were needed for autonomous flight).

However, this is still at an early stage and we have not done large-scale tests. It would be good to test and compare the odometry performance with different combinations of sensors, including (1) stereo-VIO-only, (2) LIDAR-odom-only, and (3) stereo-VIO + LIDAR-odom. It would also be good to identify degenerate cases for each combination.

The interfaces to support different sensors are all available in the UKF package included in this repo.

We should test this extensively on the large-scale forest datasets we've collected, which can be challenging for LIDAR odometry.

GPS integration in kr_autonomous_flight

Currently GPS is not used in our stack. It would be great to integrate GPS for long-term drift correction.

This is probably best done using our two-reference-frame module.

[quadrotor/rqt_gui_buttons-22] process has died

Thanks for open-sourcing your excellent work. I got the following error while executing "roslaunch gazebo_utils full_sim.launch":

[quadrotor/rqt_gui_buttons-22] process has died [pid 25168, exit code 1, cmd /opt/ros/melodic/lib/rqt_gui/rqt_gui --perspective-file /home/hy/kr_UAV_ws/src/kr_autonomous_flight/autonomy_core/client/client_launch/config/client_gui.perspective robot:=quadrotor __name:=rqt_gui_buttons __log:=/home/hy/.ros/log/e77eaff4-326f-11ed-8883-9c2976e53ddc/quadrotor-rqt_gui_buttons-22.log].
log file: /home/hy/.ros/log/e77eaff4-326f-11ed-8883-9c2976e53ddc/quadrotor-rqt_gui_buttons-22*.log

How to solve it?


How to tune PID Control Gains?

This project is amazing work, and I am also trying to learn it and run it on a real UAV.
But I am confused about tuning the SO(3) PID gains. There are three sets of gains: position gains, velocity gains, and position-error integral gains. Which gains should be tuned first, and how?
I have a little experience with the Pixhawk 4. There, to get stable flight under position-setpoint control, I would tune the attitude loop first, then the velocity loop, and finally the position loop. Is the tuning procedure the same for the SO(3) controller? Could you share any tricks? Thank you!

Voxel map storage order and its implication.

https://github.com/KumarRobotics/autonomy_stack/blob/3231c89b163faa78fd1b018cf361514129a2601a/map_plan/pkgs/mapper/src/voxel_mapper.cpp#L156

The map storage is a boost multi_array, which defaults to c_storage_order (https://www.boost.org/doc/libs/1_63_0/libs/multi_array/doc/user.html#sec_storage), i.e. "row-major".

The iteration pattern is

for (n(0) = 0; n(0) < dim_(0); n(0)++) {
  for (n(1) = 0; n(1) < dim_(1); n(1)++) {
    for (n(2) = 0; n(2) < dim_(2); n(2)++) {
      // Column-major index: consecutive idx values correspond to consecutive n(0).
      int idx = n(0) + dim_(0) * n(1) + dim_(0) * dim_(1) * n(2);
      data[idx] = array[n(0)][n(1)][n(2)];
    }
  }
}

Here data is just a 1-D vector. The computation of idx, however, is in column-major order: https://en.wikipedia.org/wiki/Row-_and_column-major_order#Address_calculation_in_general

Correct me if I'm wrong but this seems suboptimal.
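
For reference, the row-major (C-order) index that would match c_storage_order, so that the innermost n(2) loop walks memory contiguously, would be:

int idx = n(2) + dim_(2) * n(1) + dim_(2) * dim_(1) * n(0);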

Make local voxel map z-axis center aligned with robot odometry z

Currently, the global and local map z-axis ranges are assumed to be the same (or very similar). Therefore, the bottom (i.e. z-axis origin) of the local voxel map is just set the same as that of the global map. However, when planning in 3D, we sometimes do want different z-ranges for local and global voxel maps.

We need to change the local voxel map center to be aligned with the robot odometry instead. Be careful that we may also need to adjust the storage map to adapt to this change. The code changes should be minimal, but extensive testing is needed to make sure edge cases are handled properly by running 3D planning in various cluttered environments.

Update:
Would be great if we can (1) reduce the dimensions of the storage map, or (2) get rid of the storage map completely.

For (1), one idea is to have the storage map's dimensions be reasonably (e.g. 2 times) larger than the local map, and reinitialize it once in a while to be centered around the robot.

For (2), we need to think about how to initialize and update the local map efficiently (running at up to 10 Hz).

In either case, we will need to think carefully about how to reinitialize maps (i.e. you need to accumulate point clouds instead of throwing away the historical point cloud information every time you move the local map). This paper can be helpful: https://arxiv.org/abs/1703.01416.
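
As a sketch of the accumulate-instead-of-clear idea, the overlapping region of the old grid can be copied into the recentered grid; the column-major layout matches the indexing discussed in the storage-order issue above, but all names are illustrative:

#include <cstdint>
#include <vector>

// Move a column-major grid by a voxel offset, copying the overlapping region
// instead of discarding all accumulated data.
std::vector<int8_t> shiftGrid(const std::vector<int8_t>& old_data,
                              int dim0, int dim1, int dim2,
                              int off0, int off1, int off2, int8_t val_unknown) {
  std::vector<int8_t> new_data(old_data.size(), val_unknown);
  for (int k = 0; k < dim2; ++k)
    for (int j = 0; j < dim1; ++j)
      for (int i = 0; i < dim0; ++i) {
        const int si = i + off0, sj = j + off1, sk = k + off2;
        if (si < 0 || si >= dim0 || sj < 0 || sj >= dim1 || sk < 0 || sk >= dim2)
          continue;  // outside the old map: stays unknown
        new_data[i + dim0 * (j + dim1 * k)] = old_data[si + dim0 * (sj + dim1 * sk)];
      }
  return new_data;
}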

Roadmap

Let's just put all the stuff that we want here. I will start.

  1. Change the mapper to a ROS service and send the voxel grid on request.
  2. Add 3D local motion-primitive planner
  3. #57 Adding more dependencies

Trying to remove template on Dim in MPL

Templating all MPL code on the dimension is a bad design choice.
It cripples compilation speed and exposes unnecessary dependencies in header files.
There is no need for a special 2D case, since one could simply set all z components to 0 in 3D.
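
A sketch of that non-templated alternative, using a single 3D type and pinning z for 2D problems (illustrative, not MPL's actual API):

#include <Eigen/Core>

// A single 3D waypoint type; a "2D" planning problem simply fixes z.
struct Waypoint {
  Eigen::Vector3d pos{Eigen::Vector3d::Zero()};
  Eigen::Vector3d vel{Eigen::Vector3d::Zero()};
};

Waypoint make2DWaypoint(double x, double y) {
  Waypoint w;
  w.pos = Eigen::Vector3d(x, y, 0.0);  // z component pinned to 0
  return w;
}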

Reliable integration of GPS into autonomy stack

It would be good to test and compare the odometry performance with different combinations of sensors, including (1) stereo-VIO-only and (2) stereo-VIO + GPS.

One option is to directly feed GPS into the UKF; the interfaces to support different sensors are all available in the UKF package included in this repo. However, this may cause problems if the difference between the last UKF-estimated pose and the GPS-estimated pose is too large (which can happen with intermittent GPS).

Alternatively, the GPS poses can be treated the same way as semantic-SLAM-estimated poses (publish the odom-reference-frame to SLAM-reference-frame transform to compensate for odometry drift). However, we would need to think about how to inform the UKF of this additional source of odometry so that the UKF covariance can be updated.

This will probably only work in situations where GPS is reliable enough.
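
A minimal sketch of broadcasting such a drift-correction transform, assuming illustrative frame names:

#include <geometry_msgs/TransformStamped.h>
#include <ros/ros.h>
#include <tf2_ros/transform_broadcaster.h>

// Broadcast the SLAM/GPS-frame -> odom-frame correction so downstream nodes
// can compensate accumulated odometry drift.
void publishCorrection(tf2_ros::TransformBroadcaster& br,
                       const geometry_msgs::Transform& drift_correction) {
  geometry_msgs::TransformStamped t;
  t.header.stamp = ros::Time::now();
  t.header.frame_id = "slam";  // assumed fixed reference frame
  t.child_frame_id = "odom";   // assumed drifting odometry frame
  t.transform = drift_correction;
  br.sendTransform(t);
}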

Migrate docker builds to noetic

From Slack:

  • There are some small changes one needs to make to the ARL stack, but only a few lines in some Python scripts.
  • Some rosflight scripts use Python 2 exception and print syntax.
  • Some external repos like warthog and husky need to be cloned manually.

ERROR message "[Traj Tracker:] Max position error ..."

During a real UAV experiment with all propellers removed: after starting the system, I bring the UAV to the takeoff height and let it hover in place, place a waypoint far away in rviz, then press the Publish Waypoints and Execute Waypoints Mission buttons. After a few seconds, the error message "[Traj Tracker:] Max position error ..." appears. What do these error messages mean? I guess that if the UAV were executing a real flight, they might not appear. Is that true?
Thanks very much!
@versatran01

Remove pkgs folder in each sub system

The pkgs folder is a bit unnecessary now that we've removed a lot of unused packages.
It also adds one level to the file explorer tree, which is annoying.

Small offsets in UKF odometry vs. input odometry

Hi, I'm running the UKF code with LIO odometry and raw IMU data. Everything is working well, except that there are small offsets between the UKF output odometry and my LIO odometry.
[plot: LIO vs. UKF z-position]

In this plot, the orange line is the LIO odometry (/Odometry/pose/pose/position/z) and the pink line is the UKF output odometry (/odom_prop/pose/pose/position/z).

This offset increases steadily from takeoff and becomes constant later on (at a value of 15-23 cm).

A similar thing happens in the x direction, but the offset there is only about 8 cm, and there is almost no offset in the y direction.

I've tried changing the UKF config, but there is no visible change in the offsets (the max offset is still 23 cm in the z direction).

Any help is appreciated.

Thanks.
