
orb_slam's Introduction

## Check out our new ORB-SLAM2 (Monocular, Stereo and RGB-D)

ORB-SLAM Monocular

Authors: Raul Mur-Artal, Juan D. Tardos, J. M. M. Montiel and Dorian Galvez-Lopez (DBoW2)

Current version: 1.0.1 (see Changelog.md)

ORB-SLAM is a versatile and accurate Monocular SLAM solution able to compute in real-time the camera trajectory and a sparse 3D reconstruction of the scene in a wide variety of environments, ranging from small hand-held sequences to a car driven around several city blocks. It is able to close large loops and perform global relocalisation in real-time and from wide baselines.

See our project webpage: http://webdiis.unizar.es/~raulmur/orbslam/

### Related Publications

[1] Raúl Mur-Artal, J. M. M. Montiel and Juan D. Tardós. ORB-SLAM: A Versatile and Accurate Monocular SLAM System. IEEE Transactions on Robotics, vol. 31, no. 5, pp. 1147-1163, 2015. (2015 IEEE Transactions on Robotics Best Paper Award). PDF.

[2] Dorian Gálvez-López and Juan D. Tardós. Bags of Binary Words for Fast Place Recognition in Image Sequences. IEEE Transactions on Robotics, vol. 28, no. 5, pp. 1188-1197, 2012. PDF.

# 1. License

ORB-SLAM is released under a GPLv3 license. For a list of all code/library dependencies (and associated licenses), please see Dependencies.md.

For a closed-source version of ORB-SLAM for commercial purposes, please contact the authors: orbslam (at) unizar (dot) es.

If you use ORB-SLAM in an academic work, please cite:

@article{murAcceptedTRO2015,
  title={{ORB-SLAM}: a Versatile and Accurate Monocular {SLAM} System},
  author={Mur-Artal, Ra\'ul and Montiel, J. M. M. and Tard\'os, Juan D.},
  journal={IEEE Transactions on Robotics},
  volume={31},
  number={5},
  pages={1147--1163},
  doi = {10.1109/TRO.2015.2463671},
  year={2015}
 }

# 2. Prerequisites (dependencies)

## 2.1 Boost

We use the Boost library to launch the different threads of our SLAM system.

sudo apt-get install libboost-all-dev 

## 2.2 ROS

We use ROS to receive images from the camera or from a recorded sequence (rosbag), and for visualization (rviz, image_view). We have tested ORB-SLAM in Ubuntu 12.04 with ROS Fuerte, Groovy and Hydro, and in Ubuntu 14.04 with ROS Indigo. If you have not already installed ROS on your computer, we recommend installing the Full-Desktop version of ROS Fuerte (http://wiki.ros.org/fuerte/Installation/Ubuntu).

If you use ROS Indigo, remove the opencv2 dependency from manifest.xml.
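As a sketch (the exact attribute values in your checkout may differ), the dependency line to delete from manifest.xml looks like this hypothetical fragment:

```xml
<!-- manifest.xml: delete this dependency when building under ROS Indigo -->
<depend package="opencv2"/>
```

Under Indigo, OpenCV is found as a system dependency instead, so the explicit rosdep entry is no longer needed.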

## 2.3 OpenCV

We use OpenCV to manipulate images and features. If you use a ROS version older than ROS Indigo, OpenCV is already included in the ROS distribution. In newer versions of ROS, OpenCV is not included and you will need to install it. We have tested OpenCV 2.4. Download and install instructions can be found at: http://opencv.org/

## 2.4 g2o (included in Thirdparty)

We use a modified version of g2o (see the original at https://github.com/RainerKuemmerle/g2o) to perform optimizations. To compile g2o you will need Eigen3 (at least 3.1.0) installed:

sudo apt-get install libeigen3-dev

## 2.5 DBoW2 (included in Thirdparty)

We use some components of the DBoW2 and DLib libraries (see the original at https://github.com/dorian3d/DBoW2) for place recognition and feature matching. There are no additional dependencies to compile DBoW2.

# 3. Installation

  1. Make sure you have installed ROS and all library dependencies (boost, eigen3, opencv, blas, lapack).

  2. Clone the repository:

     git clone https://github.com/raulmur/ORB_SLAM.git ORB_SLAM
    
  3. Add the path where you cloned ORB-SLAM to the ROS_PACKAGE_PATH environment variable. To do this, add the following line at the bottom of your .bashrc (replace PATH_TO_PARENT_OF_ORB_SLAM with the actual path):

     export ROS_PACKAGE_PATH=${ROS_PACKAGE_PATH}:PATH_TO_PARENT_OF_ORB_SLAM
    
  4. Build g2o. Go into Thirdparty/g2o/ and execute:

     mkdir build
     cd build
     cmake .. -DCMAKE_BUILD_TYPE=Release
     make 
    

    Tip: To achieve the best performance on your computer, set your favorite compilation flags in lines 61 and 62 of Thirdparty/g2o/CMakeLists.txt (by default -O3 -march=native)

  5. Build DBoW2. Go into Thirdparty/DBoW2/ and execute:

     mkdir build
     cd build
     cmake .. -DCMAKE_BUILD_TYPE=Release
     make  
    

    Tip: Set your favorite compilation flags in lines 4 and 5 of Thirdparty/DBoW2/CMakeLists.txt (by default -O3 -march=native)

  6. Build ORB_SLAM. In the ORB_SLAM root execute:

    If you use ROS Indigo, remove the opencv2 dependency from manifest.xml before building.

     mkdir build
     cd build
     cmake .. -DROS_BUILD_TYPE=Release
     make
    

    Tip: Set your favorite compilation flags in lines 12 and 13 of ./CMakeLists.txt (by default -O3 -march=native)

# 4. Usage

See section 5 to run the Example Sequence.

  1. Launch ORB-SLAM from the terminal (roscore should already be running):

     rosrun ORB_SLAM ORB_SLAM PATH_TO_VOCABULARY PATH_TO_SETTINGS_FILE
    

You must provide the paths to the ORB vocabulary and to the settings file. The paths must be absolute or relative to the ORB_SLAM directory.
We provide the vocabulary file we use in ORB_SLAM/Data/ORBvoc.txt.tar.gz. Uncompress it first, as the uncompressed file loads much faster.
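The uncompress step can be sketched as a self-contained shell demo (the /tmp paths and placeholder file are illustrative; in the real checkout you would simply run `tar -xzf ORBvoc.txt.tar.gz` inside ORB_SLAM/Data):

```shell
# Build a throwaway stand-in for Data/ORBvoc.txt.tar.gz, then uncompress it.
mkdir -p /tmp/orb_demo/Data
cd /tmp/orb_demo
echo "ORB vocabulary placeholder" > ORBvoc.txt
tar -czf Data/ORBvoc.txt.tar.gz ORBvoc.txt   # stand-in for the shipped archive
rm ORBvoc.txt
# The actual uncompress step from the instructions above:
tar -xzf Data/ORBvoc.txt.tar.gz -C Data
ls Data/ORBvoc.txt                           # the file ORB_SLAM will load
```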

  2. The last processed frame is published to the topic /ORB_SLAM/Frame. You can visualize it using image_view:

     rosrun image_view image_view image:=/ORB_SLAM/Frame _autosize:=true
    
  3. The map is published to the topic /ORB_SLAM/Map. The current camera pose and the global world coordinate origin are sent through /tf in the frames /ORB_SLAM/Camera and /ORB_SLAM/World, respectively. Run rviz to visualize the map:

    in ROS Fuerte:

     rosrun rviz rviz -d Data/rviz.vcg
    

    in ROS Groovy or a newer version:

     rosrun rviz rviz -d Data/rviz.rviz
    
  4. ORB_SLAM receives images on the topic /camera/image_raw. You can now play your rosbag or start your camera node. If you have a sequence of individual image files, you will need to generate a bag from them. We provide a tool for that: https://github.com/raulmur/BagFromImages.

Tip: Use a roslaunch file to start ORB_SLAM, image_view and rviz with a single command. We provide examples:

in ROS Fuerte:

roslaunch ExampleFuerte.launch

in ROS Groovy or a newer version:

roslaunch ExampleGroovyOrNewer.launch
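A minimal sketch of what such a launch file can contain (node names and paths here are illustrative; the launch files shipped with the code are the authoritative versions):

```xml
<launch>
  <!-- ORB_SLAM node: vocabulary and settings paths relative to the package directory -->
  <node pkg="ORB_SLAM" type="ORB_SLAM" name="ORB_SLAM"
        args="Data/ORBvoc.txt Data/Settings.yaml" cwd="node" output="screen"/>
  <!-- Viewer for the last processed frame -->
  <node pkg="image_view" type="image_view" name="Frame_Viewer">
    <remap from="image" to="/ORB_SLAM/Frame"/>
    <param name="autosize" value="true"/>
  </node>
  <!-- Map viewer (Groovy-or-newer rviz config) -->
  <node pkg="rviz" type="rviz" name="rviz" args="-d $(find ORB_SLAM)/Data/rviz.rviz"/>
</launch>
```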

# 5. Example Sequence

We provide the settings and the rosbag of an example sequence recorded in our lab. In this sequence you will see a loop closure and two relocalisations after large viewpoint changes.

  1. Download the rosbag file:
    http://webdiis.unizar.es/~raulmur/orbslam/downloads/Example.bag.tar.gz.

    Alternative link: https://drive.google.com/file/d/0B8Qa2__-sGYgRmozQ21oRHhUZWM/view?usp=sharing

    Uncompress the file.

  2. Launch ORB_SLAM with the settings for the example sequence. You should have already uncompressed the vocabulary file (Data/ORBvoc.txt.tar.gz):

in ROS Fuerte:

  roslaunch ExampleFuerte.launch

in ROS Groovy or a newer version:

  roslaunch ExampleGroovyHydro.launch

  3. Once the ORB vocabulary has been loaded, play the rosbag (press space to start):

     rosbag play --pause Example.bag
    

# 6. The Settings File

ORB_SLAM reads the camera calibration and other parameters from a YAML file. We provide an example in Data/Settings.yaml, where you will find all parameters and their descriptions. We use the OpenCV camera calibration model.

Please make sure you write and use your own settings file for your camera (copy the example file and modify the calibration).
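For illustration, the camera block of such a settings file follows the parameter names used by the code; the numeric values below are placeholders from a hypothetical calibration, not values you should reuse:

```yaml
%YAML:1.0
# Camera calibration parameters (OpenCV pinhole model) -- placeholder values
Camera.fx: 517.3
Camera.fy: 516.5
Camera.cx: 318.6
Camera.cy: 255.3
# Distortion coefficients from your own calibration
Camera.k1: 0.26
Camera.k2: -0.95
Camera.p1: 0.0
Camera.p2: 0.0
# Frame rate and color order of your camera
Camera.fps: 30.0
Camera.RGB: 1
```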

# 7. Failure Modes

You should expect good results in sequences similar to those shown in our paper [1], in terms of camera movement and texture in the environment. In general, our Monocular SLAM solution is expected to perform poorly in the following situations:

  • No translation at system initialization (or too much rotation).
  • Pure rotations in exploration.
  • Low texture environments.
  • Many (or big) moving objects, especially if they move slowly.

The system is able to initialize from planar and non-planar scenes. In the case of planar scenes, depending on the camera movement relative to the plane, the system may refuse to initialize; see the paper [1] for details.


orb_slam's Issues

Fail to initialize the relative pose of first two frames

Hi, @raulmur
I am sorry to trouble you, but I am facing a very hard problem.
I was using your method to compute the relative pose of the first two frames,
but I always get two solutions, and the values of "bestGood" and "secondBestGood" are nearly the same.
I am looking forward to hearing from you, and best wishes to you.
Thanks in advance.

                                                    sincerely
                                                       Li Qile

multiple camera with orb_slam

Hi

I have a question about how to integrate multiple cameras in ORB_SLAM in a way that each can use the map generated from the other one (not generate new map) for localization.

Thanks

Camera specs

Hi Raul,
thanks for your work.
Can you tell me which camera did you use and if there are some camera specs that improve the performance more than others?
Thanks

Mauro

How to compute the relative pose

Hello, I am looking into your code recently
But I could not understand how to compute the relative pose between the first two frames.
could you please give me some references? thanks

used equipment

Hi,
Can you please tell me which camera (and other equipment, PC, ...) you used? Thank you.
Best regards.

Tracking on Corner-Poor Environments: Alternate Feature Detectors/Descriptors

Hello,
It would be interesting to combine this SLAM system with a feature detector and descriptor other than ORB, primarily to compare performance in special use cases, such as environments with few corners. However the ORB version implemented here is greatly extended with specific functionality which probably wouldn't be easy to convert.

  • Does anyone have an idea on this subject?
  • Can a SIFT feature detector be combined with an ORB descriptor?

error: Wrong path to settings

When I launch ORB-SLAM from the terminal by "rosrun ORB_SLAM ORB_SLAM PATH_TO_VOCABULARY PATH_TO_SETTINGS_FILE",There will be a mistake - [ERROR] [1438841720.919284826]: Wrong path to settings. Path must be absolut or relative to ORB_SLAM package directory .
ps:I have provided the path of ORB_SLAM directory to the environment variable of PATH_TO_SETTINGS_FILE

Using ATAN camera model

I'm using a very wide FOV camera (GoPro), so I'd like to use the FOV camera model (Devernay and Faugeras) instead of the provided model that OpenCV uses. I believe I've modified the code correctly, as I've double-checked my math by hand (by observing and verifying the final undistorted coordinates I feed back into the ORB-SLAM code) and the features have no problems tracking near the center of the frame. However, I'm having trouble with disappearing features at the edges of the frame (maybe the first 20% of each edge), so I wanted to check here and see if the authors had maybe implemented some checks to delete features that were corrected by the calibration to be too far outside of the image bounds, e.g. a feature at (10,10) in an image of size (1920,1080) that was then corrected to (-150,-50).

In case I haven't analyzed the ORB-SLAM code well, I should probably also mention what changes I've made to switch to the FOV model. In Data/Settings.yaml, I set fx,fy,cx,cy according to my calibration, and set k1,k2,p1,p2 to 0. I modified the call to cv::undistortPoints made in Frame.cc:UndistortKeyPoints and removed the last parameter. This allowed cv::undistortPoints to return normalized coordinates, which I then fed into my own function to account for radial distortion. I set the contents of mat to the values my function returned.

Thanks,
Sid

error:undefined symbol

Hi, I am running your code with a dataset from tum-vision,
but an error occurred and the process died.

[ INFO] [1429678011.002510623]: New Map created with 425 points /home/hongbohuang/catkin_ws/src/ORB_SLAM/bin/ORB_SLAM: symbol lookup error: /home/hongbohuang/catkin_ws/src/ORB_SLAM/bin/ORB_SLAM: undefined symbol: _ZN3g2o17EdgeSE3ProjectXYZC1Ev
[ORB_SLAM-3] process has died [pid 13939, exit code 127, cmd /home/hongbohuang/catkin_ws/src/ORB_SLAM/bin/ORB_SLAM Data/ORBvoc.yml Data/Settings.yaml __name:=ORB_SLAM __log:=/home/hongbohuang/.ros/log/6cfb0986-e893-11e4-be96-bcaec584f5df/ORB_SLAM-3.log]. log file: /home/hongbohuang/.ros/log/6cfb0986-e893-11e4-be96-bcaec584f5df/ORB_SLAM-3*.log

could you please tell me what's wrong with my options? thanks in advance
best

saving/loading maps

This is pretty awesome, thanks for sharing!

Is there currently a way to save and load maps?

about develop tool

hi, @raulmur I am ready to learn your algorithm and code,
could you please give me some suggestions about development tools? What kinds of tools do you use to develop orbslam (e.g. vim, emacs or other IDEs)?
thanks a lot!

Code for learning vocab tree

@raulmur:

Hi,
I was wondering if you could point me to the code you used to train the vocab tree. Is it something you put together or is it part of Dbow2?
Thanks
Nikhil

error

I was running ORB SLAM on my own data set , I got this error.

terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
[ORB_SLAM-3] process has died [pid 7918, exit code -6, cmd /home/shabhu/Work/rosfuerte/ORB_SLAM/bin/ORB_SLAM Data/ORBvoc.yml Data/Settings.yaml __name:=ORB_SLAM __log:=/home/shabhu/.ros/log/28136f02-e8d9-11e4-bce3-18a905b9cd56/ORB_SLAM-3.log].
log file: /home/shabhu/.ros/log/28136f02-e8d9-11e4-bce3-18a905b9cd56/ORB_SLAM-3*.log

Can you help me out?

ORB_SLAM/Frame, Map ,Camera, World

Hi,

can you show me where I can find Frame, Map, Camera and World? I don't find them under ORB_SLAM (in general I think I didn't understand these two parts:
2. The last processed frame is published to the topic /ORB_SLAM/Frame. You can visualize it using image_view; and number 3).

Thank you a lot.

Erreur de segmentation (core dumped)

Hi all, I'm trying to install ORB_SLAM, but when I do rosrun ORB_SLAM ORB_SLAM...... I get this error: Erreur de segmentation (core dumped), i.e. a segmentation fault.
Any help please, and thank you.
any help please, and thank you.

Error while building ORB_SLAM

Hello,

I am trying to build ORB_SLAM package. I am using ROS-Indigo on ubuntu 14.04. I get the following error on step 6 (https://github.com/raulmur/ORB_SLAM).

~/ORB_SLAM/build$ cmake .. -DROS_BUILD_TYPE=Release
-- Found PythonInterp: /usr/bin/python (found version "2.7.6")
[rosbuild] Building package ORB_SLAM
Failed to invoke /opt/ros/indigo/bin/rospack deps-manifests ORB_SLAM
[rospack] Error: package 'ORB_SLAM' depends on non-existent package 'opencv2' and rosdep claims that it is not a system dependency. Check the ROS_PACKAGE_PATH or try calling 'rosdep update'

CMake Error at /opt/ros/indigo/share/ros/core/rosbuild/public.cmake:129 (message):

Failed to invoke rospack to get compile flags for package 'ORB_SLAM'. Look above for errors from rospack itself. Aborting. Please fix the broken dependency!

Call Stack (most recent call first): /opt/ros/indigo/share/ros/core/rosbuild/public.cmake:207 (rosbuild_invoke_rospack)
CMakeLists.txt:4 (rosbuild_init)

-- Configuring incomplete, errors occurred!
See also "/home/steve/ORB_SLAM/build/CMakeFiles/CMakeOutput.log".

Can somebody please help me out?

Could you provide different kinds of ORB vocabulary

I have installed ORB_SLAM on the odroid u3.
However, the ORBvoc file is too big to run on the odroid board.
I would appreciate it if you could offer another smaller ORB vocabulary file just for indoor environment.

Tf Transform

Hi raul,
Thanks for your contribution. I was wondering that why tf frames behave strangely on rviz? I have found that ORB-SLAM/Camera is fixed and world is moving, why? Sometimes I also noticed that ORB-SLAM/Camera is also rotating ! How come I convert it to real world coordinate? Does tf Transform give us camera pose information? Thanks in advance.

Always "not initialized"

Dear Raul Mur
Hello, this is Li Qile
I am running your code with my own data recently, but ORB-SLAM cannot initialize; I can only see green lines on /ORB_SLAM/Frame.
May I ask you for some suggestions to fix such situations?
Thanks a lot
Best
Li Qile

Some questions on your test process and result (TUM RGB-D Benchmark)

I am testing ORB-SLAM (and LSD-SLAM) on the TUM RGB-D Benchmark dataset, and I have some questions about your test process:

ORB questions:

  • Did you only test with KeyFrames, or did you test with the poses of all the frames?
  • Did you match on timing to find the correct match between ORB-SLAM poses and groundtruth poses?
  • Does the hardware performance affect the RMSE error?
  • Does ORB-SLAM rectify images by itself, or do they have to be pre-rectified?
  • Did you use standard ROS kinect intrinsic parameters, or did you use the fr(1,2,3) calibrated parameters?

LSD-SLAM questions:

  • Did you only use Keyframes, or poses for all the frames?

Thank you

How to visualize the map with color?

Hello, in the video below:
https://www.youtube.com/watch?t=30&v=HlBmq70LKrQ

There are two versions of the map; the first contains dots in red and black
http://imgur.com/ra566zO

And another version is colorful
http://imgur.com/xn2RG5o

I've tried to find a possible topic publishing the colorful version, but I haven't found one. (I've seen /ORB_SLAM/Map_array, but it's an empty topic.) Also, I've checked the ROS parameters and found nothing related. So I can only get the first version in my rviz.
http://imgur.com/UDXhtgO

Should I modify the code or do something else to see the colorful version?
Thanks in advance.

What's the reason for picking ORB?

I would like to know the reason for picking ORB as the feature descriptor.
There are other rotational and scale invariant binary descriptors.
If I switch ORB to FREAK (or BRISK), will the system work equally well?

Is there a restart key available ?

When ORB_SLAM is running, what should I do if I want to go back to the initialization stage ?

Is there a key for restarting the program ?

Missing include DBoW2

In Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h

I needed to add:

#include <limits>

to fix:
error: ‘numeric_limits’ is not a member of ‘std’

rosrun image_view

Hi all,
I had a black window when I did:
rosrun image_view image_view image:=/ORB_SLAM/Frame _autosize:=true, any suggestion please.
PS: I'm using a Logitech camera. My frames.pdf contains "no tf data received".
Thank you a lot.

Subpixel accuracy for tracking

Did you do any subpixel refinement on the tracked features?
FAST does not give subpixel detection, and descriptor matching also does not do subpixel refinement.
(Unless you did it somewhere in the code and I didn't find it.)
Will this cause the odometry to drift more compared to other template-based tracking methods (KLT or ESM)?

How to create ORB vocabulary from dataset?

While I have managed to get your example file to work, I struggle with providing an ORB vocabulary from my own dataset. I have a set of image files converted to a .bag file.

How do you extract ORB features from an image dataset to provide to ORB-SLAM?

Different error building ORB-SLAM

Installation: Ubuntu 14.04
ROS: Indigo
Output of $ROS_PACKAGE_PATH=/home/josh/Desktop/ORB-SLAM:/opt/ros/indigo/share:/opt/ros/indigo/stacks

Error occurs in building ORB-SLAM at step 3-6. Output of error:

josh@josh-Surface-with-Windows-8-Pro:~/Desktop/ORB_SLAM/build$ cmake .. -DROS_BUILD_TYPE=RELEASE
[rosbuild] Building package ORB_SLAM
[rosbuild] Error from directory check: /opt/ros/indigo/share/ros/core/rosbuild/bin/check_same_directories.py /home/josh/Desktop/ORB_SLAM
1
Traceback (most recent call last):
File "/opt/ros/indigo/share/ros/core/rosbuild/bin/check_same_directories.py", line 46, in
raise Exception
Exception
CMake Error at /opt/ros/indigo/share/ros/core/rosbuild/private.cmake:102 (message):
[rosbuild] rospack found package "ORB_SLAM" at "", but the current
directory is "/home/josh/Desktop/ORB_SLAM". You should double-check your
ROS_PACKAGE_PATH to ensure that packages are found in the correct
precedence order.
Call Stack (most recent call first):
/opt/ros/indigo/share/ros/core/rosbuild/public.cmake:177 (_rosbuild_check_package_location)
CMakeLists.txt:4 (rosbuild_init)

-- Configuring incomplete, errors occurred!
See also "/home/josh/Desktop/ORB_SLAM/build/CMakeFiles/CMakeOutput.log".

Further research on answers.ros.org indicates that using CMAKE isn't supported, given the error above. (source: http://answers.ros.org/question/65801/ros-inside-part-of-a-c-project/ )

Does anyone have any sort of answers how to make this work? This would be an impressive project to get up and running.

Sincerely,
Josh Conway

Hi, how to access the pose of the camera?

Hello, @raulmur
I am sorry to trouble you but I am facing some problems while porting your code to an arm board.

The 1st problem is that I have no idea how to get the camera pose.
I have seen into your code, and found following lines
cv::Mat Rwc = mCurrentFrame.mTcw.rowRange(0,3).colRange(0,3).t();
cv::Mat twc = -Rwc*mCurrentFrame.mTcw.rowRange(0,3).col(3);
Is Rwc the rotation matrix of the camera in the world coordinate system?
and
I am also wondering how to access the (x,y,z) coordinates of the camera in the world coordinate system:
is it twc or mCurrentFrame.mTcw.rowRange(0,3).col(3)?

The second problem is how to remove the viewer and rviz from the program, because I only need the pose of the camera. Also, how do I find out all the topics orbslam publishes?

I am looking forward to hearing from you
best wishes to you

                                                                                                             yours, sincerely
                                                                                                             Li Qile

Tracking fails half way for KITTI seq 00

Environment: ROS indigo, Ubuntu Trusty 14.04 on VM ware workstation, CPU i7 2.4GHz, 8G RAM
Dataset: KITTI\data_odometry_gray\dataset\sequences\00\image_0
Bag file: created with rosrun BagFromImages BagFromImages IMAGE_PATH 10.0 image_0.bag
settings.yaml:
%YAML:1.0

# Camera Parameters. Adjust them!

# Camera calibration parameters (OpenCV)
Camera.fx: 718.856
Camera.fy: 718.856
Camera.cx: 607.1928
Camera.cy: 185.2157

# Camera distortion parameters (OpenCV)
Camera.k1: 0
Camera.k2: 0
Camera.p1: 0.0
Camera.p2: 0.0

# Camera frames per second
Camera.fps: 10.0

# Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
Camera.RGB: 1

#--------------------------------------------------------------------------------------------
# Changing the parameters below could seriously degrade the performance of the system

# ORB Extractor: Number of features per image
ORBextractor.nFeatures: 2000

# ORB Extractor: Scale factor between levels in the scale pyramid
ORBextractor.scaleFactor: 1.2

# ORB Extractor: Number of levels in the scale pyramid
ORBextractor.nLevels: 8

# ORB Extractor: Fast threshold (lower is less restrictive)
ORBextractor.fastTh: 20

# ORB Extractor: Score to sort features. 0 -> Harris Score, 1 -> FAST Score
ORBextractor.nScoreType: 1

# Constant Velocity Motion Model (0 - disabled, 1 - enabled [recommended])
UseMotionModel: 1

During execution, tracking always failed and the relocalization module was invoked, regardless of the velocity motion model and nScoreType settings.
Do you have any idea why the result shown on the ORB-SLAM project website cannot be reproduced?

ORB-SLAM dies after 5 minutes

Hi,

when using ORB-SLAM in our applications (onboard an MAV) everything works fine until about 5 minutes into flight. Usually around that time, the ORB-SLAM process dies. Is there a way to see/check why it dies? Is it a memory issue? It just gives us exit code -9

only seeing polygons in rviz

Hi guys,

ORB-SLAM itself seems to be working fine (features tracking well), but I can't visualize the 3D map in rviz at all. I see very weird shapes/polygons instead. Here's a screenshot. Any help would be appreciated.

Thanks,
Sid

ORB_SLAM initialization failure on Indigo, Ubuntu 14.04

Hello,
I have remapped the camera topics correctly, but I am getting this error:
ORB-SLAM Copyright (C) 2014 Raul Mur-Artal
This program comes with ABSOLUTELY NO WARRANTY;
This is free software, and you are welcome to redistribute it
under certain conditions. See LICENSE.txt.

Loading ORB Vocabulary. This could take a while.

Wrong path to vocabulary. Path must be absolut or relative to ORB_SLAM package directory.
[orb_slam-3] process has died [pid 30997, exit code 1, cmd /home/farhan/catkin_ws/devel/lib/orb_slam/orb_slam Data/ORBvoc.yml Data/Settings.yaml /camera/image_raw:=/creative_cam/image_color __name:=orb_slam __log:=/home/farhan/.ros/log/2e26604a-16ae-11e5-9789-d8fc9361bda4/orb_slam-3.log].
log file: /home/farhan/.ros/log/2e26604a-16ae-11e5-9789-d8fc9361bda4/orb_slam-3*.log

Thanks
Farhan

Dictionary size

1. What is the size of the visual dictionary you are using? I mean, how many nodes are there? I haven't seen such good relocalization in any other SLAM algorithm.
2. What does the motion model in the settings.yaml file do? Are you using an EKF with a constant velocity model?

thanks

test on KITTI VO seq 00: Not initialized -> Trying to initialize

Dear all,

I installed ORB_SLAM and ran the example successfully. Now I am running a test on KITTI VO seq 00, as the authors did in their paper.
But when I start ORB_SLAM on KITTI VO seq 00, I just get "Not initialized -> Trying to initialize" and see street-view images with green edges in /ORB_SLAM/Frame; please see the link:
https://www.dropbox.com/s/0oi3iu3nqzwpv1h/47.png?dl=0

Could someone give me some hints?

Thanks in advance~

Milton

scale changed after turning

hi, @raulmur
thanks for sharing your orb-slam code. Sorry to trouble you, but I have a problem while tracking.
I first go straight towards north and then turn east.
orb-slam calculates the turn angle very well, but the scale of distance changed obviously after turning east.
The walking distances towards north and east are nearly the same, but in rviz the distance east is obviously shorter than the distance north.
Could you give me some suggestions?
thanks and best

Error during make Indigo

Hi,
I got the following error message during the ORB_SLAM make. I've done everything as described in the manual.

Regards,
husdo

/opt/ros/indigo/share/ORB_SLAM/Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h: In instantiation of ‘DBoW2::TemplatedVocabulary<TDescriptor, F>::~TemplatedVocabulary() [with TDescriptor = cv::Mat; F = DBoW2::FORB]’:
/opt/ros/indigo/share/ORB_SLAM/src/main.cc:86:29: required from here
/opt/ros/indigo/share/ORB_SLAM/Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h:510:3: warning: deleting object of abstract class type ‘DBoW2::GeneralScoring’ which has non-virtual destructor will cause undefined behaviour [-Wdelete-non-virtual-dtor]
delete m_scoring_object;
^
/opt/ros/indigo/share/ORB_SLAM/Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h: In instantiation of ‘void DBoW2::TemplatedVocabulary<TDescriptor, F>::createScoringObject() [with TDescriptor = cv::Mat; F = DBoW2::FORB]’:
/opt/ros/indigo/share/ORB_SLAM/Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h:420:23: required from ‘DBoW2::TemplatedVocabulary<TDescriptor, F>::TemplatedVocabulary(int, int, DBoW2::WeightingType, DBoW2::ScoringType) [with TDescriptor = cv::Mat; F = DBoW2::FORB]’
/opt/ros/indigo/share/ORB_SLAM/src/main.cc:86:29: required from here
/opt/ros/indigo/share/ORB_SLAM/Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h:446:3: warning: deleting object of abstract class type ‘DBoW2::GeneralScoring’ which has non-virtual destructor will cause undefined behaviour [-Wdelete-non-virtual-dtor]
delete m_scoring_object;
^
[ 11%] Building CXX object CMakeFiles/ORB_SLAM.dir/src/Tracking.cc.o
[ 16%] Building CXX object CMakeFiles/ORB_SLAM.dir/src/LocalMapping.cc.o
[ 22%] Building CXX object CMakeFiles/ORB_SLAM.dir/src/LoopClosing.cc.o
[ 27%] Building CXX object CMakeFiles/ORB_SLAM.dir/src/ORBextractor.cc.o
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc: In member function ‘void ORB_SLAM::ORBextractor::ComputeKeyPoints(std::vector<std::vector<cv::KeyPoint> >&)’:
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:606:63: error: ‘FAST’ was not declared in this scope
FAST(cellImage,cellKeyPoints[i][j],fastTh,true);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:615:34: error: ‘ORB’ has not been declared
if( scoreType == ORB::HARRIS_SCORE )
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:682:17: error: ‘KeyPointsFilter’ has not been declared
KeyPointsFilter::retainBest(keysCell,nToRetain[i][j]);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:698:13: error: ‘KeyPointsFilter’ has not been declared
KeyPointsFilter::retainBest(keypoints,nDesiredFeatures);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc: In member function ‘void ORB_SLAM::ORBextractor::operator()(cv::InputArray, cv::InputArray, std::vector<cv::KeyPoint>&, cv::OutputArray)’:
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:759:82: error: ‘GaussianBlur’ was not declared in this scope
GaussianBlur(workingMat, workingMat, Size(7, 7), 2, 2, BORDER_REFLECT_101);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc: In member function ‘void ORB_SLAM::ORBextractor::ComputePyramid(cv::Mat, cv::Mat)’:
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:799:78: error: ‘INTER_LINEAR’ was not declared in this scope
resize(mvImagePyramid[level-1], mvImagePyramid[level], sz, 0, 0, INTER_LINEAR);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:799:90: error: ‘resize’ was not declared in this scope
resize(mvImagePyramid[level-1], mvImagePyramid[level], sz, 0, 0, INTER_LINEAR);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:802:80: error: ‘INTER_NEAREST’ was not declared in this scope
resize(mvMaskPyramid[level-1], mvMaskPyramid[level], sz, 0, 0, INTER_NEAREST);
^
make[2]: *** [CMakeFiles/ORB_SLAM.dir/src/ORBextractor.cc.o] Error 1
make[1]: *** [CMakeFiles/ORB_SLAM.dir/all] Error 2
make: *** [all] Error 2

Error Make Indigo

Hello,

I am currently using Ubuntu 14.04 + ROS Indigo. I have cloned the ORB_SLAM repository into my catkin workspace and all of the necessary dependencies are installed, but when I launch "ExampleGroovyHydro.launch" I get the following error:

ERROR: cannot launch node of type [ORB_SLAM/ORB_SLAM]: can't locate node [ORB_SLAM] in package [ORB_SLAM]

And indeed I cannot find the ORB_SLAM node within the package. Also when I run the following instruction: rosrun ORB_SLAM ORB_SLAM PATH_TO_VOCABULARY PATH_TO_SETTINGS_FILE, I get:

[rosrun] Couldn't find executable named ORB_SLAM below /home/nils/catkin_ws/src/ORB_SLAM.

Does anyone know how I could solve this error? I would appreciate it a lot. Regards.

how can i run this on ROS (Jade)?

When I install ORB_SLAM, I have a problem.
/******************************************************************************
Failed to invoke /opt/ros/jade/bin/rospack deps-manifests ORB_SLAM-master
[rospack] Error: package 'ORB_SLAM-master' depends on non-existent package 'opencv2' and rosdep claims that it is not a system dependency. Check the ROS_PACKAGE_PATH or try calling 'rosdep update'
******************************************************************************/
How can I solve this problem? Thanks.

camera poses of all the frames

Hi Raul,

Thanks for sharing the code.
I want to save the camera pose of all the frames, not only for the key frames. Could you please tell how to do this?

thanks,
