
ORB-SLAM2 map saving and loading for closed circuit localization

This project provides a ROS package for the localization of an autonomous vehicle. The localization is based on a map consisting of ORB features. The mapping and localization module is taken from the ORB-SLAM2 implementation; our project builds on top of its ROS-enabled version. With this extension, the map of ORB features can be saved to disk as a reference for future runs along the same track. At high speeds, map-based localization may be preferred over full SLAM because it offers better localization accuracy and a global positioning reference. The use case for the presented system is a closed-circuit racing scenario: in a first step, we create a map at low speed with full SLAM; in subsequent runs at higher speeds, we localize against the previously saved map to increase the localization accuracy. The usage is demonstrated with an example from the KITTI dataset. The flow chart of the extended functionality is shown in Figure 1.

Installation of Dependencies

The system has been tested on Ubuntu 16.04. It depends on the following libraries, as used by the original ORB-SLAM2 implementation:

OpenCV

We use OpenCV to manipulate images and features. Download and installation instructions can be found at http://opencv.org. At least OpenCV 2.4.3 is required. Tested with OpenCV 2.4.11 and OpenCV 3.2.

Eigen3

Required by g2o. Download and installation instructions can be found at http://eigen.tuxfamily.org. At least version 3.1.0 is required.

ROS

ROS is required by this extended version since it provides the ORB-SLAM2 functionality as a ROS package and uses topic broadcasts to publish the camera position via tf, the map points as a PointCloud, and the tracking state as a String. Tested with ROS Kinetic Kame.

Boost

The Map save/load extension uses the boost serialization library libboost_serialization. Install via:

sudo apt install libboost-serialization-dev

Octomap - ROS

Install the octomap ROS package via:

sudo apt-get install ros-kinetic-octomap ros-kinetic-octomap-mapping ros-kinetic-octomap-msgs  ros-kinetic-octomap-ros ros-kinetic-octomap-rviz-plugins ros-kinetic-octomap-server

Installation

  • Clone this repository in the src folder of a catkin workspace. For example:
cd catkin_ws/src
git clone https://gitlab.lrz.de/perception/orbslam-map-saving-extension
  • Build the package
cd ..
catkin_make
  • Source the workspace
source devel/setup.bash
  • Extract the ORBvoc.txt file in orb_slam2_lib/Vocabulary
cd src/orbslam-map-saving-extension/orb_slam2_lib/Vocabulary
tar -xf ORBvoc.txt.tar.gz
  • Transform the vocabulary file ORBvoc from .txt to .bin for faster loading
./bin_vocabulary

Usage Example

We demonstrate the usage of the package with example data from the KITTI dataset. The example consists of four steps: (1) download the KITTI data and convert it to a compatible format; (2) save an ORB feature map of the KITTI data; (3) load the ORB map and perform localization at higher speed; (4) visualize the mapping and localization trajectories:

1 - Prepare Kitti example data

  • Install kitti2bag for transforming RAW Kitti data to ROS bags.
pip install kitti2bag
  • Download the synced+rectified and calibration data of a KITTI RAW sequence. We choose the drive 2011_09_30_drive_0027 from the Residential category:
wget https://s3.eu-central-1.amazonaws.com/avg-kitti/raw_data/2011_09_30_drive_0027/2011_09_30_drive_0027_sync.zip
wget https://s3.eu-central-1.amazonaws.com/avg-kitti/raw_data/2011_09_30_calib.zip
unzip 2011_09_30_drive_0027_sync.zip
unzip 2011_09_30_calib.zip
  • Transform the RAW data to a ROS bag
kitti2bag -t 2011_09_30 -r 0027 raw_synced

To follow the example smoothly, the rosbag should be located at $(env HOME)/datasets/KITTI/kitti_2011_09_30_drive_0027_synced.bag.
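If kitti2bag wrote the bag into the current working directory, it can be moved to the expected location as follows (the target path is the launch file default quoted above):

```shell
# create the expected dataset folder and move the generated bag there
mkdir -p "$HOME/datasets/KITTI"
mv kitti_2011_09_30_drive_0027_synced.bag "$HOME/datasets/KITTI/"
```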

2 - Creation and saving of ORB feature map

Before you launch the SLAM system, make sure the corresponding ROS workspace is sourced:

source devel/setup.bash

The SLAM is then started via:

roslaunch orb_slam2_ros orb_slam_kitti_from_bag.launch

After starting the launch file, the ORB vocabulary is loaded. Press the SPACE key to start the replay of the bag file and the mapping process.

When the bag data replay is finished, press Ctrl-C to shut down the system; the map saving process is then executed. The resulting ORB map file map.bin is saved in the current ROS workspace at catkin_ws/devel/lib/orb_slam2_ros/. The SLAM trajectory file FrameTrajectory_TUM_Format.txt is saved in the same folder.

Rename the trajectory file to slam_FrameTrajectory_TUM_Format.txt so that it is not overwritten by the subsequent localization run.
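Starting from the workspace root, the rename can be done like this (file locations as stated above):

```shell
# rename the SLAM trajectory so the localization run does not overwrite it
cd devel/lib/orb_slam2_ros/
mv FrameTrajectory_TUM_Format.txt slam_FrameTrajectory_TUM_Format.txt
```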

Optional: Change the data path in the launch file to match the bag file location on your system. The default data path in the launch file is $(env HOME)/datasets/KITTI/kitti_2011_09_30_drive_0027_synced.bag. The default ORB-SLAM settings file is configured to work with the KITTI example bag file. The mapping from settings file numbers to further KITTI sequences can be found in this Readme.

3 - Localization on the ORB feature map

The map loading and saving settings are found at the bottom of the ORB settings file. Edit orbslam-map-saving-extension/orb_slam2_ros/settings/KITTI04-12.yaml: set Map.load_map: 1 to load the previously created map when the system initializes, and deactivate map saving with Map.save_map: 0.
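For the localization run, the map section at the bottom of the settings file would then read as follows (key names as given above; the comments are only explanatory):

```yaml
# Map saving/loading settings at the bottom of KITTI04-12.yaml
Map.load_map: 1   # load the previously saved map.bin on startup
Map.save_map: 0   # do not overwrite the map on shutdown
```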

In the ROS launch file, set the replay speed of the rosbag to 2x by changing -r 1 to -r 2 to simulate a faster run for the localization on the example data.
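The relevant line in the launch file could look like the sketch below; only the -r rate flag comes from the text above, while the node name, the $(arg bag_path) argument, and the use of --pause (which matches the "press SPACE to start" behavior of rosbag play) are illustrative assumptions:

```xml
<!-- sketch: replay the example bag at double speed, paused until SPACE -->
<node pkg="rosbag" type="play" name="rosbag_play" output="screen"
      args="--pause -r 2 $(arg bag_path)"/>
```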

Now, re-run the launch file to start the localization run:

roslaunch orb_slam2_ros orb_slam_kitti_from_bag.launch

The command line output confirms the successful loading of the map file map.bin. Press the SPACE key to start the replay of the bag file and the localization process.

When the bag data replay is finished, press Ctrl-C to shut down the system. This time, only the trajectory files are saved.

Rename the trajectory file to localization_FrameTrajectory_TUM_Format.txt so that it is not overwritten by subsequent localization runs.

4 - Visualization

For the visualization, additional packages are needed:

pip install matplotlib
pip install evo 

Trajectory visualization

To visualize the trajectories of the SLAM and localization runs, a basic plotting script is provided in the main folder. For the example, simply run

python example_trajectory_plots.py
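The provided script is the reference; as an illustration, a minimal comparison plot of the two TUM-format trajectory files (one pose per line: timestamp tx ty tz qx qy qz qw) could look like the sketch below. The file names follow the renaming steps above; plotting the x/z ground plane matches the KITTI camera convention (x right, z forward).

```python
import numpy as np
import matplotlib
matplotlib.use("Agg")  # headless backend, write the plot to a file
import matplotlib.pyplot as plt

def load_tum(path):
    """Load a TUM-format trajectory and return the Nx3 positions."""
    data = np.loadtxt(path)  # columns: t tx ty tz qx qy qz qw
    return data[:, 1:4]

def plot_trajectories(slam_path, loc_path, out_png="trajectories.png"):
    """Overlay the SLAM (mapping) and localization trajectories."""
    slam, loc = load_tum(slam_path), load_tum(loc_path)
    plt.figure()
    plt.plot(slam[:, 0], slam[:, 2], label="SLAM (mapping)")
    plt.plot(loc[:, 0], loc[:, 2], label="localization")
    plt.xlabel("x [m]")
    plt.ylabel("z [m]")
    plt.legend()
    plt.savefig(out_png)
```

A call such as plot_trajectories("slam_FrameTrajectory_TUM_Format.txt", "localization_FrameTrajectory_TUM_Format.txt") would then produce a figure similar to the one shown below.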

The output should be as follows:

SLAM and Localization Trajectories

We see that the localization trajectory visually matches the mapping trajectory even at higher speed. Note: if you run the example bag at 2x speed in SLAM mode, the mapping process eventually fails due to failed feature matching.

Feature Visualization

During the SLAM or localization process you can visualize the matching of the ORB features: start rqt_image_view and listen to the /orb_slam2/frame topic. An example frame output for the localization mode is shown here:

ORB matching visualization

Additional information

KITTI Odometry Sequence Mapping for Settings File

The following table maps the setting file number to the name, start frame and end frame of each sequence that has been used to extract the visual odometry / SLAM training set:

Nr.     Sequence name     Start   End
---------------------------------------
00: 2011_10_03_drive_0027 000000 004540
01: 2011_10_03_drive_0042 000000 001100
02: 2011_10_03_drive_0034 000000 004660
03: 2011_09_26_drive_0067 000000 000800
04: 2011_09_30_drive_0016 000000 000270
05: 2011_09_30_drive_0018 000000 002760
06: 2011_09_30_drive_0020 000000 001100
07: 2011_09_30_drive_0027 000000 001100
08: 2011_09_30_drive_0028 001100 005170
09: 2011_09_30_drive_0033 000000 001590
10: 2011_09_30_drive_0034 000000 001200

Based on this table, for the example sequence 2011_09_30_drive_0027, the settings file KITTI04-12.yaml is the correct one.

Contributions

This implementation was created during the Interdisciplinary Project (IDP) of Odysseas Papanikolaou at the Institute of Automotive Technology of the Technical University of Munich. Contact: [email protected]

If you find our work useful in your research, please consider citing:

@INPROCEEDINGS{nobis20loc,
    author={Felix Nobis and Odysseas Papanikolaou and Johannes Betz and Markus Lienkamp},
    title={Persistent Map Saving for Visual Localization for Autonomous Vehicles: An ORB-SLAM\,2 Extension},
    booktitle={2020 Fifteenth International Conference on Ecological Vehicles and Renewable Energies (EVER)},
    year={2020}
}

orbslam-map-saving-extension's People

Contributors

fnobis, tumftm


orbslam-map-saving-extension's Issues

ORBState.h file missing

Hi,

I have tried to build the software in order to test it. I am unable to do so due to the missing file orb_slam2_ros/ORBState.h.

I was unable to locate the file either in this repo or in the original repo from Raul Mur-Artal et al.

Looking forward to your answer!

Kind regards,
Milos Vujadinovic

DBoW Voc

Is the vocabulary file ORBvoc.txt already established before building the visual map, or is it created from the images captured by the camera?

pangolin

I want to use Pangolin for visualization. What should I do?

Replacing AMCL

From my understanding the usual ROS1 navigation stack uses AMCL to perform the transform from Map frame to the Odom frame. Is it possible for orbslam to replace this transformation and act as the localisation node? From the bag file I observe that the algorithm is giving an odom2 to camera transformation.

How is it possible to use this odom2 to camera transformation to replace the AMCL Map to Odom transformation?

image

localization does not work in RGB-D mode after the map is built

Hello, I have a problem when I use this extension. Could you give me some suggestions to solve it? The details are as follows:
Before the problem occurred, I had built the map.
screenshot_4
screenshot_5
After that, I modified the config file to change the SLAM mode.
image
Finally, I used the same command to re-run and found that the map was reconstructed.
image
But when I used Ctrl-C to get the trajectory file, an error occurred.
image
And there was no trajectory file in the lib folder. I have tried to solve this problem and finally found that the image callback does not work when the map is reused! I can't understand why it does not work; I just ran the same code as when the map was built.

Deploying Map Saving Extension with ORBSLAM2 GUI

I was able to run the extension in real time with an Intel D435 depth camera. I modified some of the launch files to read the correct YAML file for my camera. I am able to deploy RViz and visualize both the map and the frame with the feature detection segmentation. I was wondering if there is a way to keep the original GUI from ORB-SLAM2 and have it deploy automatically on startup. I am attaching my files and the commands to start the process.

roslaunch realsense2_camera rs_rgbd.launch
roslaunch orb_slam2_ros orb_slam_D435.launch

ros mono initialized, data: "Not initialized"

I made an orb_ros_mono.launch for monocular SLAM and localization, but it does not initialize for KITTI. When I use the original ORB-SLAM2 / ROS / MONO, it works well. So what is wrong?

using mono mode for relocalization

It works great when I use Stereo for map saving and map reusing. But when I change the mode to Mono and the camera topic for Mono:
<node pkg="orb_slam2_ros" type="Mono" name="orb_slam2" args="$(find orb_slam2_lib)/Vocabulary/ORBvoc.bin $(arg orb_settings_file) False" cwd="node" output="screen"> <remap from="kitti/camera_gray_left/image_raw" to="/camera/image_raw"/>
the process dies when I roslaunch for map saving:
[orb_slam2-3] process has died [pid 17959, exit code 1, cmd /home/xxx/orbslam_ws/devel/lib/orb_slam2_ros/Mono /home/xxx/orbslam_ws/src/orbslam-map-saving-extension/orb_slam2_lib/Vocabulary/ORBvoc.bin /home/xxx/orbslam_ws/src/orbslam-map-saving-extension/orb_slam2_ros/settings/KITTI04-12.yaml False __name:=orb_slam2 __log:=/home/xxx/.ros/log/038d10b0-cfc5-11ea-8e1d-4c1d9634addc/orb_slam2-3.log]. log file: /home/xxx/.ros/log/038d10b0-cfc5-11ea-8e1d-4c1d9634addc/orb_slam2-3*.log
I checked that the configuration is all fine, but map.bin and the trajectory file were not generated. Do you know what I should do to overcome these problems?
