
rovio's Introduction

README

This repository contains the ROVIO (Robust Visual Inertial Odometry) framework. The code is open source (BSD license). Please note that it is strongly coupled to ongoing research, so some parts are not fully mature yet. Furthermore, the code will be subject to changes in the future, which could include major refactoring of some parts.

Video: https://youtu.be/ZMAISVy-6ao

Papers:

Please also have a look at the wiki: https://github.com/ethz-asl/rovio/wiki

Install without OpenGL scene

Dependencies:

#!command

catkin build rovio --cmake-args -DCMAKE_BUILD_TYPE=Release

Install with OpenGL scene

Additional dependencies: OpenGL, GLUT, GLEW (sudo apt-get install freeglut3-dev libglew-dev)

#!command

catkin build rovio --cmake-args -DCMAKE_BUILD_TYPE=Release -DMAKE_SCENE=ON

Euroc Datasets

The rovio_node.launch file loads parameters such that ROVIO runs properly on the Euroc datasets. The datasets are available at: http://projects.asl.ethz.ch/datasets/doku.php?id=kmavvisualinertialdatasets

Further notes

  • Camera matrix and distortion parameters should be provided via a YAML file or loaded through rosparam.
  • The cfg/rovio.info file provides most parameters for ROVIO. The camera extrinsics qCM (quaternion from IMU to camera frame, Hamilton convention) and MrMC (translation from IMU to camera expressed in the IMU frame) should also be set there. They are estimated during runtime, so a rough guess is sufficient.
  • Especially for applications with little motion, fixing the IMU-camera extrinsics can be beneficial. This can be done by setting the parameter doVECalibration to false. Please be careful: overall robustness and accuracy can be very sensitive to a bad extrinsic calibration.

rovio's People

Contributors

alexmillane, andreasjaeger, bloesch, burrimi, devbharat, fmina, grafue, helenol, kartikmohta, michaelpantic, nicolov, raghavkhanna, zacharytaylor

rovio's Issues

Clarify License

First, thanks for posting the video, paper, and code!

Could you clarify the license that this code is available under? The README.md indicates open source, while the package.xml indicates Proprietary.

How to set parameters of rovio with the result of kalibr

Hi,
I've managed to get an acceptable calibration result for my smartphone with kalibr, but I'm confused about setting the parameters of rovio in the info file. As @bloesch mentioned in #25, some parameters are critical. For example, qCM: I calculate it from the transformation matrix T_ic, is that right? Are there any other important parameters I should take into consideration?

Regards
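
For illustration, here is a minimal sketch of that conversion, assuming kalibr's IMU-to-camera transform (the camchain's T_cam_imu, i.e. what the question calls T_ic) maps points from the IMU frame (M) into the camera frame (C), and following the README's convention that qCM is the IMU-to-camera rotation (Hamilton) and MrMC the IMU-to-camera translation expressed in the IMU frame. All numeric values are placeholders; double-check the quaternion convention against your rovio version, since some of the other quaternions in rovio.info are documented as JPL.

```cpp
// Minimal sketch (not rovio code): derive the qCM / MrMC entries of
// rovio.info from a kalibr-style IMU-to-camera transform.
// Assumption: T_CM maps points from the IMU frame (M) into the camera
// frame (C), i.e. p_C = R_CM * p_M + t_CM. All numbers are placeholders.
#include <Eigen/Dense>
#include <iostream>

int main() {
  Eigen::Matrix4d T_CM;  // e.g. T_cam_imu from kalibr's camchain file
  T_CM << 0.0, -1.0,  0.0,  0.05,
          0.0,  0.0, -1.0, -0.02,
          1.0,  0.0,  0.0,  0.01,
          0.0,  0.0,  0.0,  1.00;

  const Eigen::Matrix3d R_CM = T_CM.topLeftCorner<3, 3>();
  const Eigen::Vector3d t_CM = T_CM.topRightCorner<3, 1>();

  // Hamilton quaternion of the IMU-to-camera rotation (qCM in rovio.info).
  const Eigen::Quaterniond qCM(R_CM);

  // Camera position relative to the IMU, expressed in the IMU frame
  // (MrMC in rovio.info): p_C = 0  =>  p_M = -R_CM^T * t_CM.
  const Eigen::Vector3d MrMC = -R_CM.transpose() * t_CM;

  std::cout << "qCM_x " << qCM.x() << "\nqCM_y " << qCM.y()
            << "\nqCM_z " << qCM.z() << "\nqCM_w " << qCM.w()
            << "\nMrMC_x " << MrMC.x() << "\nMrMC_y " << MrMC.y()
            << "\nMrMC_z " << MrMC.z() << std::endl;
  return 0;
}
```

Since the extrinsics are refined online when doVECalibration is enabled, a rough guess derived this way is typically sufficient as a starting point.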

Sign of the H_ matrix

The formula used for computing the H matrix in jacState:

F = -c_J_*featureOutputJac_.template block<2,mtState::D_>(0,0);

and in generateCandidates

canditateGenerationH_  = c_J_*featureOutputJac_.template block<2,mtState::D_>(0,0);

I can't see why the sign is different.

Scene window is still

The scene window is static and shows nothing but the plane grid and the camera axes at the origin.
Everything else works: the node publishes odometry updates and the scene calls the render function.
Could you share some pointers on how to investigate this?
I am running rovio on Ubuntu 14.04 with OpenCV 3.0.0.

Publishing visualiser output over ROS

I'm running Rovio on a small MAV with a high-bandwidth link to the GCS, and it would be preferable to have the visualiser window output published over a ROS topic. I can already turn off the visualiser window using the requisite parameter, but being able to see Rovio's output in e.g. Rviz (over the network) on the ground would be very useful.

Could someone give me some pointers on this? I'm willing to implement it and send in a PR.
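
As a starting point, here is a minimal, self-contained sketch of publishing an OpenCV image over a ROS image topic with image_transport and cv_bridge. The frame source, topic name, and rate are placeholders and not rovio's actual internals; the rendered visualiser frame would have to be fed in instead of the dummy image.

```cpp
// Minimal sketch: publish a cv::Mat as a ROS image topic so it can be viewed
// remotely. Topic name, rate, and the dummy frame are placeholders.
#include <ros/ros.h>
#include <std_msgs/Header.h>
#include <image_transport/image_transport.h>
#include <cv_bridge/cv_bridge.h>
#include <opencv2/core/core.hpp>

int main(int argc, char** argv) {
  ros::init(argc, argv, "visualization_publisher");
  ros::NodeHandle nh;
  image_transport::ImageTransport it(nh);
  image_transport::Publisher pub = it.advertise("rovio/visualization", 1);

  cv::Mat frame(480, 640, CV_8UC3, cv::Scalar(0, 0, 0));  // placeholder frame

  ros::Rate rate(20.0);
  while (ros::ok()) {
    std_msgs::Header header;
    header.stamp = ros::Time::now();
    pub.publish(cv_bridge::CvImage(header, "bgr8", frame).toImageMsg());
    ros::spinOnce();
    rate.sleep();
  }
  return 0;
}
```

The published topic can then be viewed on the ground with rqt_image_view or an Image display in Rviz over the network.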

Rovio always fails to compile the first time on a new computer

This is something I have seen on a few systems now, and I am unsure what is causing it. If you create a new catkin_ws and build rovio, it will always fail the first time with the error:

Errors << rovio:make /home/zac/catkin_ws/logs/rovio/build.make.000.log
In file included from /home/zac/catkin_ws/src/rovio/src/rovio_node.cpp:40:0:
/home/zac/catkin_ws/src/rovio/include/rovio/RovioNode.hpp:50:34: fatal error: rovio/SrvResetToPose.h: No such file or directory
#include <rovio/SrvResetToPose.h>
^
compilation terminated.
make[2]: *** [CMakeFiles/rovio.dir/src/rovio_node.cpp.o] Error 1
make[1]: *** [CMakeFiles/rovio.dir/all] Error 2
make: *** [all] Error 2
cd /home/zac/catkin_ws/build/rovio; catkin build --get-env rovio | catkin env -si /usr/bin/make --jobserver-fds=6,7 -j; cd -

However, if the build is immediately rerun, everything builds fine and the error never appears again.

How do I properly forward imu data to rovio?

Hi,

I am stuck at the configuration stage of rovio, trying to forward IMU data from my PX4 via mavros to rovio. I have tried many different configurations in my launch file but have had no success, and I see no indication in rqt_graph that IMU data is being forwarded. I have mavros running in another terminal, publishing raw IMU data to /mavros/imu/data_raw.

Here is my terminal output
`uav_master@MobileGCS:~/vision_ws/src/rovio/launch$ roslaunch rovio_node.launch
... logging to /home/uav_master/.ros/log/b123a4de-5803-11e6-af86-204747c2636d/roslaunch-MobileGCS-17397.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://MobileGCS:43650/

SUMMARY

PARAMETERS

  • /cam0/camera_frame_id: usb_cam
  • /cam0/image_height: 480
  • /cam0/image_width: 640
  • /cam0/io_method: mmap
  • /cam0/pixel_format: yuyv
  • /cam0/video_device: /dev/video1
  • /image_view/autosize: True
  • /rosdistro: indigo
  • /rosversion: 1.11.20
  • /rovio/camera0_config: /home/uav_master/...
  • /rovio/filter_config: /home/uav_master/...

NODES
/
cam0 (usb_cam/usb_cam_node)
image_view (image_view/image_view)
rovio (rovio/rovio_node)

ROS_MASTER_URI=http://localhost:11311

core service [/rosout] found
process[cam0-1]: started with pid [17415]
process[image_view-2]: started with pid [17416]
process[rovio-3]: started with pid [17417]
[ INFO] [1470070253.664638694]: Using transport "raw"
Set Camera Matrix to:
958.1 0 274.394
0 967.598 295.947
0 0 1
Set distortion parameters (Radtan) to: k1(0.103935), k2(-0.25267), k3(0), p1(0.021415), p2(-0.014961)
Testing Prediction
==== Test successful (5.91687e-08) ====
==== Test successful (5.54897e-08) ====
Testing cameraOutputCF
==== Test successful (6.36613e-08) ====
Testing imuOutputCF
==== Test successful (1.54622e-08) ====
Testing attitudeToYprCF
==== Test successful (4.66041e-08) ====
Testing transformFeatureOutputCT
==== Test successful (1.02961e-08) ====
Testing LandmarkOutputImuCT
==== Test successful (2.26763e-08) ====
Testing FeatureOutputReadableCT
==== Test successful (1.10557e-08) ====
Testing pixelOutputCT (can sometimes exhibit large absolut errors due to the float precision)
==== Model jacInput Test failed: 591.941 is larger than 1 at row 1(def_0) and col 0() ====
649694 649102
Testing zero velocity update
==== Test successful (3.04311e-10) ====
==== Test successful (1.39778e-10) ====
[ INFO] [1470070253.890565167]: using default calibration URL
[ INFO] [1470070253.890691757]: camera calibration URL: file:///home/uav_master/.ros/camera_info/head_camera.yaml
[ INFO] [1470070253.891664723]: Starting 'head_camera' (/dev/video1) at 640x480 via mmap (yuyv) at 30 FPS
-- Filter: Initializing using accel. measurement ...
-- Filter: Initialized at t = 1470070253.92
[ WARN] [1470070254.176357929]: unknown control 'white_balance_temperature_auto'

[ WARN] [1470070254.179373808]: unknown control 'focus_auto'

OpenCV Error: Assertion failed (scn == 1 && (dcn == 3 || dcn == 4)) in cvtColor, file /build/buildd/opencv-2.4.8+dfsg1/modules/imgproc/src/color.cpp, line 3789
terminate called after throwing an instance of 'cv::Exception'
what(): /build/buildd/opencv-2.4.8+dfsg1/modules/imgproc/src/color.cpp:3789: error: (-215) scn == 1 && (dcn == 3 || dcn == 4) in function cvtColor

[rovio-3] process has died [pid 17417, exit code -6, cmd /home/uav_master/vision_ws/devel/lib/rovio/rovio_node /imu0:=/mavros/imu/data __name:=rovio __log:=/home/uav_master/.ros/log/b123a4de-5803-11e6-af86-204747c2636d/rovio-3.log].
log file: /home/uav_master/.ros/log/b123a4de-5803-11e6-af86-204747c2636d/rovio-3*.log
`

This error is only thrown when I use the following:
<remap from="/imu0" to="/mavros/imu/data"/>

If I use:
`

`

I then get:
`uav_master@MobileGCS:~/vision_ws/src/rovio/launch$ roslaunch rovio_node.launch
... logging to /home/uav_master/.ros/log/b123a4de-5803-11e6-af86-204747c2636d/roslaunch-MobileGCS-17712.log
Checking log directory for disk usage. This may take awhile.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://MobileGCS:49584/

SUMMARY

PARAMETERS

  • /cam0/camera_frame_id: usb_cam
  • /cam0/image_height: 480
  • /cam0/image_width: 640
  • /cam0/io_method: mmap
  • /cam0/pixel_format: yuyv
  • /cam0/video_device: /dev/video1
  • /image_view/autosize: True
  • /rosdistro: indigo
  • /rosversion: 1.11.20
  • /rovio/cam0_topic_name: /cam0/image_raw
  • /rovio/imu0_topic_name: /imu/data_raw

NODES
/
cam0 (usb_cam/usb_cam_node)
image_view (image_view/image_view)
rovio (rovio/rovio_node)

ROS_MASTER_URI=http://localhost:11311

core service [/rosout] found
process[cam0-1]: started with pid [17730]
process[image_view-2]: started with pid [17731]
process[rovio-3]: started with pid [17732]
[ INFO] [1470070687.099686439]: Using transport "raw"
Set Camera Matrix to:
958.1 0 274.394
0 967.598 295.947
0 0 1
Set distortion parameters (Radtan) to: k1(0.103935), k2(-0.25267), k3(0), p1(0.021415), p2(-0.014961)
Testing Prediction
==== Test successful (5.91687e-08) ====
==== Test successful (5.54897e-08) ====
Testing cameraOutputCF
==== Test successful (6.36613e-08) ====
Testing imuOutputCF
==== Test successful (1.54622e-08) ====
Testing attitudeToYprCF
==== Test successful (4.66041e-08) ====
Testing transformFeatureOutputCT
==== Test successful (1.02961e-08) ====
Testing LandmarkOutputImuCT
==== Test successful (2.26763e-08) ====
Testing FeatureOutputReadableCT
==== Test successful (1.10557e-08) ====
Testing pixelOutputCT (can sometimes exhibit large absolut errors due to the float precision)
==== Model jacInput Test failed: 591.941 is larger than 1 at row 1(def_0) and col 0() ====
649694 649102
Testing zero velocity update
==== Test successful (3.04311e-10) ====
==== Test successful (1.39778e-10) ====
[ INFO] [1470070687.321380506]: using default calibration URL
[ INFO] [1470070687.321540739]: camera calibration URL: file:///home/uav_master/.ros/camera_info/head_camera.yaml
[ INFO] [1470070687.322607110]: Starting 'head_camera' (/dev/video1) at 640x480 via mmap (yuyv) at 30 FPS
[ WARN] [1470070687.607545284]: unknown control 'white_balance_temperature_auto'

[ WARN] [1470070687.613731130]: unknown control 'focus_auto'
`

I was using https://github.com/ethz-asl/rovio/issues/53 as an indicator that I still do not have the IMU configuration set up properly.

Thanks for any advice.

Better stability for fast rotations

Hi, I'm having problems keeping track during fast movements. Say my system faces one way and I suddenly rotate it by 90 or 180 degrees in any direction, so that no currently tracked feature is visible anymore and new features have to be found. This always results in tracking failure and I have to restart the filter.
Are there any parameters I can tweak, besides getting better hardware?

Interesting new stereo cam + IMU offering

I've just found this interesting off-the-shelf product with a global shutter stereo camera and an integrated IMU. Now that the VI-Sensor is no longer for sale, it could be an easy solution for rovio, at $249.

link

While the cameras are synchronized, the spec sheet doesn't say anything about IMU sync. However, since everything is delivered over the same USB3 connection, you'd expect the time delay to be fairly stable, and thus correctable.

Cleaning the rovio node output

The outputs of the rovio node are not very tidy, and it currently computes a lot of different outputs. My suggestion:

  • tf output: world->imu->cameras, possibly extended with frames involving the external pose measurements (e.g. GPS)
  • PoseWithCovarianceStamped for world to IMU: this is the one that should be used for feedback control
  • TwistWithCovarianceStamped for the velocity and rotational rate of the IMU frame

I will also add outputs for the IMU biases, the IMU-camera extrinsics, and the features. However, all other topics (transforms/odometry/poses) will be removed.

Any feedback/comments/wishes?
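
For concreteness, a minimal sketch of what the proposed interface could look like, using standard geometry_msgs types; topic names, frame ids, and values are placeholders, not rovio's actual output.

```cpp
// Sketch of the proposed slimmed-down output interface: one
// PoseWithCovarianceStamped (world -> IMU) and one TwistWithCovarianceStamped
// (IMU velocity and rotational rate). All names and values are placeholders.
#include <ros/ros.h>
#include <geometry_msgs/PoseWithCovarianceStamped.h>
#include <geometry_msgs/TwistWithCovarianceStamped.h>

int main(int argc, char** argv) {
  ros::init(argc, argv, "rovio_output_sketch");
  ros::NodeHandle nh;
  ros::Publisher pose_pub = nh.advertise<geometry_msgs::PoseWithCovarianceStamped>(
      "rovio/pose_with_covariance", 1);
  ros::Publisher twist_pub = nh.advertise<geometry_msgs::TwistWithCovarianceStamped>(
      "rovio/twist_with_covariance", 1);

  ros::Rate rate(100.0);
  while (ros::ok()) {
    geometry_msgs::PoseWithCovarianceStamped pose_msg;
    pose_msg.header.stamp = ros::Time::now();
    pose_msg.header.frame_id = "world";        // world -> IMU pose
    pose_msg.pose.pose.orientation.w = 1.0;    // fill from the filter state
    // pose_msg.pose.covariance[...] = ...;    // 6x6 row-major covariance

    geometry_msgs::TwistWithCovarianceStamped twist_msg;
    twist_msg.header.stamp = pose_msg.header.stamp;
    twist_msg.header.frame_id = "imu";         // robocentric velocity/rate

    pose_pub.publish(pose_msg);
    twist_pub.publish(twist_msg);
    rate.sleep();
  }
  return 0;
}
```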

Compilation Error with Opencv3, no member named 'mpCamera'

I had to switch my project over to a new laptop which supports OpenGL ES 3.0. In doing so, I had to start from scratch gathering the dependencies and supporting packages. The other difference between my successful compilation on the old laptop and the new one is that I decided to use OpenCV3 with all of the extra modules installed. I had to make modifications to point the Eigen dependency to Eigen3, but that is about it as far as changes go. The log is attached.

rovio.build.log.txt

live rovio diverging from the start

Hi,

when I run rovio with the provided test data (Machinehall + the provided yaml configs for the camera) everything works very good and precise.
I'm still waiting for my global shutter camera, so I tried to make rovio run with my logitech webcam HD Pro C920 (+tinkerforge imu brick 2.0). But right from the start rovio is totally lost. Please see the video below for a demonstration. I calibrated the system using kalibr with the usual steps. To track down the problem I also changed some parameters (like imu-cam orientation) to deliberately wrong values, but no change. Even when using a dummy-imu-node that publishes only 0 values, the result is the same.

https://www.youtube.com/watch?v=72EdG0ik-eY

Has anybody encountered this and can point me in some general direction? Of course I can provide a bag file with some sample data.
thanks!

rovio convention for camera coordinate system

I'm trying to get rovio to run on video/IMU data recorded on a Google Tango (handheld device, using the RGB camera). It works on the Euroc dataset, but not on the Tango data. I believe the camera calibration is okay, but I'm not sure I got the IMU-to-camera transformation right.

What convention are you using for the camera coordinate system? The standard one (x to the right, y down, z along the viewing direction)?

When looking at the imu data:

linear_acceleration:
x: -0.057368904
y: 10.583193
z: 3.0418723

it appears that y is pointing up.
Is there a way to print out what the gravitational vector "g" looks like in camera coordinates? Basically, any method that verifies that the transform is correct?
Here is a video of rovio in action, but not quite working yet.

Any help is appreciated!
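
One simple sanity check, sketched below with Eigen: take a stationary accelerometer sample (which measures the specific force, so it points opposite to gravity) and rotate the resulting gravity direction into the camera frame with the calibrated IMU-to-camera rotation. With the standard camera convention (x right, y down, z forward) and the camera roughly level, the result should point close to +y. The rotation used here is a placeholder identity; substitute your own calibration.

```cpp
// Minimal sketch: check where gravity ends up in camera coordinates.
// A stationary accelerometer measures the specific force (roughly -g), so it
// points "up" in the IMU frame; gravity is the opposite direction.
#include <Eigen/Dense>
#include <iostream>

int main() {
  // Stationary accelerometer reading in the IMU frame [m/s^2] (from the issue).
  const Eigen::Vector3d a_M(-0.057368904, 10.583193, 3.0418723);

  // IMU-to-camera rotation (placeholder identity; use your calibrated value).
  const Eigen::Quaterniond q_CM = Eigen::Quaterniond::Identity();

  const Eigen::Vector3d g_M = -a_M.normalized();  // gravity direction, IMU frame
  const Eigen::Vector3d g_C = q_CM * g_M;         // gravity direction, camera frame

  std::cout << "gravity direction in camera frame: " << g_C.transpose() << std::endl;
  // With the standard camera convention and a level camera, expect roughly (0, 1, 0).
  return 0;
}
```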

Running on multiple cores on embedded linux

So I finally have rovio running on an i.MX6 under embedded Linux. It has a quad-core 1 GHz processor, but rovio uses only one core, and I get 3-4 frames per second.

I see that the CMake file currently disables OpenMP by default. Has anyone tried running it in parallel?

Extrinsic message

I suppose this in rovio_node.hpp

ros::Publisher pubExtrinsics_[mtState::nMax_];

should be

ros::Publisher pubExtrinsics_[mtState::nCam_];

? Not important, but anyway.

Pose Update from Tag Detector

I have been trying to include the raw pose obtained from a tag detector in rovio. Both the tag-based position and attitude are being used, and pose feedback to features in rovio is enabled. The problem is that even though the raw tag-based position follows the ground truth pretty well (see plots below), the rovio output does not converge to it, even after multiple measurements.

[plot: withtag_npos2_pos]

My rovio.info file looks like this (nPose was set to 1):

PoseUpdate
{
    UpdateNoise
    {
        pos_0 0.5;                          X-Covariance of linear pose measurement update [m^2]
        pos_1 0.5;                          Y-Covariance of linear pose measurement update [m^2]
        pos_2 1e-7;                         Z-Covariance of linear pose measurement update [m^2]
        att_0 0.5;                          X-Covariance of rotational pose measurement update [rad^2]
        att_1 0.5;                          Y-Covariance of rotational pose measurement update [rad^2]
        att_2 0.5;                          Z-Covariance of rotational pose measurement update [rad^2]
    }
    init_cov_IrIW 10;                           Covariance of initial pose between inertial frames, linear part [m^2]
    init_cov_qWI 10;                                Covariance of initial pose between inertial frames, rotational part [rad^2]
    init_cov_MrMV 10;                           Covariance of initial pose between inertial frames, linear part [m^2]
    init_cov_qVM 10;                                Covariance of initial pose between inertial frames, rotational part [rad^2]
    pre_cov_IrIW 1e-4;                          Covariance parameter of pose between inertial frames, linear part [m^2/s]
    pre_cov_qWI 1e-4;                           Covariance parameter of pose between inertial frames, rotational part [rad^2/s]
    pre_cov_MrMV 1e-4;                          Covariance parameter of pose between inertial frames, linear part [m^2/s]
    pre_cov_qVM 1e-4;                           Covariance parameter of pose between inertial frames, rotational part [rad^2/s]
    MahalanobisTh0 12.6511204;                  Mahalanobis distance threshold of pose measurement
    doVisualization true;                       Should the measured poses be visualized
    enablePosition true;                        Should the linear part be used during the update
    enableAttitude true;                        Should the rotational part be used during the update (e.g. set to false if feeding GPS measurements)
    noFeedbackToRovio false;                    By default the pose update is only used for aligning the coordinate frames. Set to false if ROVIO should benefit from the pose measurements.
    doInertialAlignmentAtStart true;            Should the transformation between I and W be explicitly computed and set with the first pose measurement.
    timeOffset 0.0;                             Time offset added to the pose measurement timestamps
    qVM_x 0;                                    X-entry of quaternion representing IMU to reference body coordinate frame of pose measurement (JPL)
    qVM_y 0;                                    Y-entry of quaternion representing IMU to reference body coordinate frame of pose measurement (JPL)
    qVM_z 0;                                    Z-entry of quaternion representing IMU to reference body coordinate frame of pose measurement (JPL)
    qVM_w 1;                                    W-entry of quaternion representing IMU to reference body coordinate frame of pose measurement (JPL)
    MrMV_x 0;                                   X-entry of vector representing IMU to reference body coordinate frame of pose measurement
    MrMV_y 0;                                   Y-entry of vector representing IMU to reference body coordinate frame of pose measurement
    MrMV_z 0;                                   Z-entry of vector representing IMU to reference body coordinate frame of pose measurement
    qWI_x 0;                                    X-entry of quaternion representing World to reference inertial coordinate frame of pose measurement (JPL)
    qWI_y 0;                                    Y-entry of quaternion representing World to reference inertial coordinate frame of pose measurement (JPL)
    qWI_z 0;                                    Z-entry of quaternion representing World to reference inertial coordinate frame of pose measurement (JPL)
    qWI_w 1;                                    W-entry of quaternion representing World to reference inertial coordinate frame of pose measurement (JPL)
    IrIW_x 0;                                   X-entry of vector representing World to reference inertial coordinate frame of pose measurement
    IrIW_y 0;                                   Y-entry of vector representing World to reference inertial coordinate frame of pose measurement
    IrIW_z 0;                                   Z-entry of vector representing World to reference inertial coordinate frame of pose measurement
}

With the same configuration, but using ground truth from gazebo instead of the raw tag-based pose, I get excellent results (even with these horribly high covariance values):

Result with sending ground truth position to rovio:
[plot: gt_used_pos]

Essentially, the only difference I see between the position updates from the tag and those from ground truth is not the position value itself (although the tag-based position is slightly noisy), but the time delay of the detector actually processing the image to extract the pose (around 300 ms).

I also constantly get the warning:

Warning: included update measurements before safeTime

when including the tag-based pose, which I don't get when using ground truth.

Also, the visualization of the ground truth in ROVIO's 'scene' window is not smooth when using the tag-based pose (it momentarily moves, then stays put even as rovio's CF moves around), but it is very smooth (and agrees with ROVIO's CF) when using the ground-truth pose.

Packing use of visualization_msgs with MAKE_SCENE

I am porting rovio to embedded Linux using yocto and meta-ros, and it seems wasteful to have rovio publish (and require being built with) visualization_msgs when it is not built with the scene, especially on a Linux image built without GUI support.

I can manually edit it out, but I was wondering whether it makes sense in general to send out visual marker messages only if rovio is built with the scene.
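
A sketch of that suggestion: compile the marker publishing only when the scene is built, guarded by a preprocessor symbol. The symbol name MAKE_SCENE here simply mirrors the CMake option and is an assumption; the actual compile definition used by rovio's build may differ.

```cpp
// Sketch: only touch visualization_msgs when the scene build is enabled.
// MAKE_SCENE as a compile definition is assumed here, mirroring the CMake
// option; check the build system for the real symbol.
#include <ros/ros.h>
#ifdef MAKE_SCENE
#include <visualization_msgs/Marker.h>
#endif

class OutputPublisher {
 public:
  explicit OutputPublisher(ros::NodeHandle& nh) {
#ifdef MAKE_SCENE
    marker_pub_ = nh.advertise<visualization_msgs::Marker>("rovio/markers", 1);
#endif
  }

  void publishMarkers() {
#ifdef MAKE_SCENE
    visualization_msgs::Marker marker;
    marker.header.stamp = ros::Time::now();
    marker_pub_.publish(marker);
#endif
    // Without the scene build, no visualization_msgs usage is compiled in here.
  }

 private:
#ifdef MAKE_SCENE
  ros::Publisher marker_pub_;
#endif
};
```

The package.xml/CMakeLists dependency would still need to be made conditional for a fully GUI-free build, which is more a packaging question than a code one.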

file INSTALL cannot copy file 'lightweight_filtering/cmake/FindLWF.cmake'

I noticed that after running catkin build --cmake-args -DCMAKE_BUILD_TYPE=Release -DMAKE_SCENE=ON (or just a plain catkin build), I get the install error below due to restricted permissions. A sudo prefix solves it, but that is not recommended. Is this supposed to happen?

nuno@nuno-P17SM:~/AIMAV_Project$ catkin build --cmake-args -DCMAKE_BUILD_TYPE=Release -DMAKE_SCENE=ON
--------------------------------------------------------------------------------
Profile:                     default
Extending:             [env] /home/nuno/AIMAV_Project/devel:/home/nuno/AIMAV_Project/devel_isolated/rovio:/home/nuno/AIMAV_Project/devel_isolated/kindr:/home/nuno/PINCAir/src/visim_drone/ws/devel:/opt/ros/jade
Workspace:                   /home/nuno/AIMAV_Project
Source Space:       [exists] /home/nuno/AIMAV_Project/src
Build Space:        [exists] /home/nuno/AIMAV_Project/build
Devel Space:        [exists] /home/nuno/AIMAV_Project/devel
Install Space:     [missing] /home/nuno/AIMAV_Project/install
DESTDIR:                     None
--------------------------------------------------------------------------------
Isolate Develspaces:         False
Install Packages:            True
Isolate Installs:            False
--------------------------------------------------------------------------------
Additional CMake Args:       -DCMAKE_BUILD_TYPE=Release -DMAKE_SCENE=ON
Additional Make Args:        None
Additional catkin Make Args: None
Internal Make Job Server:    True
--------------------------------------------------------------------------------
Whitelisted Packages:        None
Blacklisted Packages:        None
--------------------------------------------------------------------------------
Workspace configuration appears valid.
-------------------------------------------------------------------------------- 
Found '2' packages in 0.0 seconds. 
Starting ==> kindr                                                              
Finished <== kindr [ 0.8 seconds ]                                              
Starting ==> rovio                                                              

[rovio] ==> '/home/nuno/AIMAV_Project/build/rovio/build_env.sh /usr/bin/make install' in '/home/nuno/AIMAV_Project/build/rovio'
[ 62%] Built target rovio
[ 75%] Built target feature_tracker_node
[ 87%] Built target rovio_node
[100%] Built target rovio_rosbag_loader
Install the project...
-- Install configuration: "Release"
-- Installing: /home/nuno/AIMAV_Project/install/_setup_util.py
-- Installing: /home/nuno/AIMAV_Project/install/env.sh
-- Installing: /home/nuno/AIMAV_Project/install/setup.bash
-- Installing: /home/nuno/AIMAV_Project/install/setup.sh
-- Installing: /home/nuno/AIMAV_Project/install/setup.zsh
-- Installing: /home/nuno/AIMAV_Project/install/.rosinstall
-- Installing: /home/nuno/AIMAV_Project/install/lib/pkgconfig/rovio.pc
-- Installing: /home/nuno/AIMAV_Project/install/share/rovio/cmake/rovioConfig.cmake
-- Installing: /home/nuno/AIMAV_Project/install/share/rovio/cmake/rovioConfig-version.cmake
-- Installing: /home/nuno/AIMAV_Project/install/share/rovio/package.xml
-- Up-to-date: /home/nuno/AIMAV_Project/install/_setup_util.py
-- Up-to-date: /home/nuno/AIMAV_Project/install/env.sh
-- Up-to-date: /home/nuno/AIMAV_Project/install/setup.bash
-- Up-to-date: /home/nuno/AIMAV_Project/install/setup.sh
-- Up-to-date: /home/nuno/AIMAV_Project/install/setup.zsh
-- Up-to-date: /home/nuno/AIMAV_Project/install/.rosinstall
-- Installing: /home/nuno/AIMAV_Project/install/lib/pkgconfig/lightweight_filtering.pc
-- Installing: /home/nuno/AIMAV_Project/install/share/lightweight_filtering/cmake/lightweight_filteringConfig.cmake
-- Installing: /home/nuno/AIMAV_Project/install/share/lightweight_filtering/cmake/lightweight_filteringConfig-version.cmake
-- Installing: /home/nuno/AIMAV_Project/install/share/lightweight_filtering/package.xml
-- Installing: /home/nuno/AIMAV_Project/install/include/LWF/include
-- Installing: /home/nuno/AIMAV_Project/install/include/LWF/include/lightweight_filtering
-- Installing: /home/nuno/AIMAV_Project/install/include/LWF/include/lightweight_filtering/ModelBase.hpp
-- Installing: /home/nuno/AIMAV_Project/install/include/LWF/include/lightweight_filtering/TestClasses.hpp
-- Installing: /home/nuno/AIMAV_Project/install/include/LWF/include/lightweight_filtering/Update.hpp
-- Installing: /home/nuno/AIMAV_Project/install/include/LWF/include/lightweight_filtering/PropertyHandler.hpp
-- Installing: /home/nuno/AIMAV_Project/install/include/LWF/include/lightweight_filtering/SigmaPoints.hpp
-- Installing: /home/nuno/AIMAV_Project/install/include/LWF/include/lightweight_filtering/Prediction.hpp
-- Installing: /home/nuno/AIMAV_Project/install/include/LWF/include/lightweight_filtering/common.hpp
-- Installing: /home/nuno/AIMAV_Project/install/include/LWF/include/lightweight_filtering/FilterBase.hpp
-- Installing: /home/nuno/AIMAV_Project/install/include/LWF/include/lightweight_filtering/CoordinateTransform.hpp
-- Installing: /home/nuno/AIMAV_Project/install/include/LWF/include/lightweight_filtering/State.hpp
-- Installing: /home/nuno/AIMAV_Project/install/include/LWF/include/lightweight_filtering/OutlierDetection.hpp
-- Installing: /home/nuno/AIMAV_Project/install/include/LWF/include/lightweight_filtering/FilterState.hpp
-- Installing: /home/nuno/AIMAV_Project/install/include/LWF/include/lightweight_filtering/GIFPrediction.hpp
-- Installing: /usr/share/cmake-3.2/Modules/FindLWF.cmake
CMake Error at lightweight_filtering/cmake_install.cmake:157 (file):
  file INSTALL cannot copy file
  "/home/nuno/AIMAV_Project/src/rovio/lightweight_filtering/cmake/FindLWF.cmake"
  to "/usr/share/cmake-3.2/Modules/FindLWF.cmake".
Call Stack (most recent call first):
  cmake_install.cmake:134 (include)

make: *** [install] Error 1
[rovio] <== '/home/nuno/AIMAV_Project/build/rovio/build_env.sh /usr/bin/make install' failed with return code '2'

Failed   <== rovio [ 1 minute and 24.4 seconds ]                                
[build] There were '1' errors:                                                  

Failed to build package 'rovio' because the following command:

# Command to reproduce:
cd /home/nuno/AIMAV_Project/build/rovio && /home/nuno/AIMAV_Project/build/rovio/build_env.sh /usr/bin/make install; cd -

# Path to log:
cat /home/nuno/AIMAV_Project/build/build_logs/rovio.log

Exited with return code: 2 

Just a side note: I'm on an Ubuntu 14.04.3 (3.19.0-43-generic kernel) laptop with ROS Jade.

Compatibility issue with ubuntu 16.04

Hi! I tried compiling ROVIO on Ubuntu 16.04 with ROS Jade and got over 100 build errors. Another person compiled older versions of boost, opencv, etc. to see if he could resolve such errors, with no luck. Is anyone else facing similar issues?

SrvResetToPose.h missing

RovioNode.hpp requires SrvResetToPose.h, but I can't find this header file anywhere.
Am I missing something?

fatal error: rovio/SrvResetToPose.h: No such file or directory
#include <rovio/SrvResetToPose.h>

Reset/Init Rovio with initial external pose

What would be the best way to implement resetting/initialising Rovio with a pose estimate from an external filter?

I want to init Rovio in the beginning with an initial pose, and then, if it ever loses track or diverges, reset Rovio to the last known good pose (again from an external topic).

What is the best way to do this? Including detection of divergence?
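
One possible shape for the divergence-detection side, sketched below: monitor the covariance of rovio's odometry output and keep the most recent externally supplied pose around. Topic names and the threshold are placeholders, and the actual reset would go through rovio's reset-to-pose service (SrvResetToPose); its exact request fields are not reproduced here and should be taken from the generated service header.

```cpp
// Sketch: external divergence monitor for rovio. Watches the position
// covariance of the odometry output and remembers the last "good" external
// pose. Topic names and the threshold are placeholders.
#include <ros/ros.h>
#include <nav_msgs/Odometry.h>
#include <geometry_msgs/PoseStamped.h>

geometry_msgs::PoseStamped last_good_pose;  // latest pose from the external filter

void externalPoseCallback(const geometry_msgs::PoseStamped& msg) {
  last_good_pose = msg;
}

void odometryCallback(const nav_msgs::Odometry& msg) {
  // Crude divergence heuristic: sum of the position variances (diagonal of
  // the 6x6 row-major pose covariance).
  const double pos_var =
      msg.pose.covariance[0] + msg.pose.covariance[7] + msg.pose.covariance[14];
  const double kThreshold = 1.0;  // [m^2], tune for the application
  if (pos_var > kThreshold) {
    ROS_WARN("Position covariance %.3f exceeds threshold; reset rovio to the "
             "last good external pose here (e.g. via its reset-to-pose service).",
             pos_var);
  }
}

int main(int argc, char** argv) {
  ros::init(argc, argv, "rovio_divergence_monitor");
  ros::NodeHandle nh;
  ros::Subscriber odom_sub = nh.subscribe("rovio/odometry", 10, odometryCallback);
  ros::Subscriber pose_sub = nh.subscribe("external_pose", 10, externalPoseCallback);
  ros::spin();
  return 0;
}
```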

rovio with high precision

Maybe someone can help me out. My setting is the following: I need high-precision orientation in a relatively small space (maybe 1 m³). Are there any special parameters I can tweak to get higher precision, i.e. which parameters are the most influential for precision?
Thank you!

Better feature initialization

I was wondering if it makes sense to initialize (or look for) new features in specific 'sections' of the image based on the current pose uncertainty of the camera, so as to gain the most information with the fewest new features, rather than spreading many features everywhere. Surely, based on the current pose uncertainty of the camera and the locations of the currently tracked features, new features in certain directions will yield more information for localizing the camera than in other directions? @bloesch

Reference

You should put a reference to the paper in the README for citation ;)

error: GLSL ES 3.00 is not supported

I have a problem when I run rovio_node.launch. It shows:
Error compiling shader type 35633: '0:1(10): error: GLSL ES 3.00 is not supported. Supported versions are: 1.10, 1.20, and 1.00 ES
[rovio-2] process has died [pid 4556, exit code 1, cmd /home/uav/vision_ws/devel/lib/rovio/rovio_node __name:=rovio __log:=/home/uav/.ros/log/9c1c037e-5411-11e6-bb65-002170f29c86/rovio-2.log].
log file: /home/uav/.ros/log/9c1c037e-5411-11e6-bb65-002170f29c86/rovio-2*.log
^C[rosout-1] killing on exit
Does anyone know how to solve this problem?

Add FOV distortion model support for FishEye Lens

Hi bloesch,
I am using a fisheye-lens camera. I have calibrated the extrinsics between the IMU and the camera, and also the camera intrinsics using the FOV model, so I would like rovio to support FOV as well.

I have now added supporting functions for the FOV model to the Camera::distort function. But in the visualized GUI frame I see the number on the same feature point keep increasing, so I think the feature tracking is in a bad state, right?

As I understand it, Camera::bearingToPixel does the projection (3D -> 2D) and Camera::pixelToBearing does the back-projection (2D -> 3D). How can these two functions use the same Camera::distort function? I don't quite understand the logic in Camera::pixelToBearing, especially the variable "du" in the for loop. Could you give some guidance on this?

Thanks
Gino
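
For reference, a sketch of the FOV (ATAN) distortion model of Devernay & Faugeras, which is what kalibr's "fov" model implements, operating on normalized image coordinates. The closed-form inverse shown here is also what a generic iterative inversion of distort() (as in Camera::pixelToBearing) should converge to; the parameter value is a placeholder.

```cpp
// Sketch of the FOV (ATAN) distortion model on normalized image coordinates.
// distortFov: undistorted -> distorted; undistortFov: analytic inverse.
// The distortion parameter w is a placeholder value.
#include <cmath>
#include <cstdio>

void distortFov(double w, double x, double y, double& xd, double& yd) {
  const double r = std::sqrt(x * x + y * y);
  const double factor = (r < 1e-8 || w < 1e-8)
      ? 1.0
      : std::atan(2.0 * r * std::tan(w / 2.0)) / (w * r);
  xd = factor * x;
  yd = factor * y;
}

void undistortFov(double w, double xd, double yd, double& x, double& y) {
  const double rd = std::sqrt(xd * xd + yd * yd);
  const double factor = (rd < 1e-8 || w < 1e-8)
      ? 1.0
      : std::tan(rd * w) / (2.0 * rd * std::tan(w / 2.0));
  x = factor * xd;
  y = factor * yd;
}

int main() {
  const double w = 0.93;  // placeholder FOV distortion parameter
  double xd, yd, x, y;
  distortFov(w, 0.3, -0.2, xd, yd);
  undistortFov(w, xd, yd, x, y);
  std::printf("distorted: (%f, %f), round trip: (%f, %f)\n", xd, yd, x, y);
  return 0;
}
```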

Position in inertial-z (altitude) seems to drift

When executing vertical motion with the camera facing down, rovio's z estimate seems to drift over time with respect to gazebo's ground truth. It happens more severely when the vertical motion is executed slowly (the slower, the worse) and less severely when the accelerations are higher.

I can observe in the Rviz output that, when rising slowly, rather than moving the IMU-camera position estimate in the upward direction, the filter adjusts the locations of the tracked feature patches (they all start condensing closer together). This doesn't happen when the vertical accelerations are higher.

Plots below:
Slow vertical motion:
[plots: vel_test_z_slow_3d, vel_test_z_slow_position, vel_test_z_slow_linear_velocity]

Fast vertical motion:
[plots: vel_test_z_3d, vel_test_z_position, vel_test_z_linear_velocity]

The datasets were collected with the gazebo simulator, PX4 SITL, and rovio position feedback.

Using GLSL 3.00 ES

Can the shader files be modified to use GLSL version 3.00 ES easily?
Or can the node be compiled so as not to require the graphics output used by the shaders?

Thanks.

Trouble linking compilation to OpenCV3

I have been trying to compile ROVIO with OpenCV3. I have successfully compiled ROVIO on an older system with OpenCV2.

I believe the issue is that ROVIO is unable to find the correct paths to OpenCV3. It is worth noting that OpenCV2 (libopencv-dev) is installed as a dependency of the mavros-extras package. I tried installing ros-indigo-opencv3 to see if it would fix the pointer issues. I am unsure how to proceed at this point. I have attached my latest build log.

build.make.008.log.txt

Error opening file?

I'm using Ubuntu 14.04 with ROS indigo.

The launch file is set to run only one camera and the IMU and camera topics are remapped to the corresponding topics in my bag file.

The error shows:

Recording
Storing output to: /path/to/file.bag
terminate called after throwing an instance of 'rosbag::BagIOException'
what(): Error opening file: /path/to/file.bag

Any way to resolve this?

Thanks.

Segmentation fault (core dumped) / process has died /

I installed rovio (which I have never used before) on a VM and on a regular laptop (both with Ubuntu 16.04 and ROS Kinetic).

On both machines I tried running
rosrun rovio rovio_node
rosrun rovio rovio_rosbag_loader

Both terminate with
Segmentation fault (core dumped)

If I run
gdb devel/.private/rovio/lib/rovio/rovio_node
from the catkin workspace, I get this:
`(gdb) run
Starting program: /home/jaeger/catkin-ws/devel/.private/rovio/lib/rovio/rovio_node
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
[New Thread 0x7fffe6a4c700 (LWP 120143)]
[New Thread 0x7fffe624b700 (LWP 120144)]
[New Thread 0x7fffe5a4a700 (LWP 120145)]
[New Thread 0x7fffe5249700 (LWP 120150)]

Thread 1 "rovio_node" received signal SIGSEGV, Segmentation fault.
0x000000000052b39d in rovio::State<25u, 4, 6, 1, 0>::State() ()
(gdb) bt
#0 0x000000000052b39d in rovio::State<25u, 4, 6, 1, 0>::State() ()
#1 0x0000000000541cad in LWF::FilterState<rovio::State<25u, 4, 6, 1, 0>, rovio::PredictionMeas, rovio::PredictionNoise<rovio::State<25u, 4, 6, 1, 0> >, 0u>::FilterState() ()
#2 0x000000000054b008 in rovio::FilterState<25u, 4, 6, 1, 0>::FilterState() ()
#3 0x0000000000551307 in LWF::FilterBase<rovio::ImuPrediction<rovio::FilterState<25u, 4, 6, 1, 0> >, rovio::ImgUpdate<rovio::FilterState<25u, 4, 6, 1, 0> >, rovio::PoseUpdate<rovio::FilterState<25u, 4, 6, 1, 0>, -1, -1> >::FilterBase() ()
#4 0x000000000055265c in rovio::RovioFilter<rovio::FilterState<25u, 4, 6, 1, 0> >::RovioFilter() ()
#5 0x000000000046acbc in main ()

(gdb) `

And when I try to run roslaunch rovio rovio_rosbag_node.launch (with the Machine Hall 01 rosbag) I get this:
rovio_rosbag_node.launch :
`

`

Output:
`...
ROS_MASTER_URI=http://localhost:11311

core service [/rosout] found
process[rovio-1]: started with pid [120282]
[rovio-1] process has died [pid 120282, exit code -11, cmd /home/jaeger/catkin-ws/devel/lib/rovio/rovio_rosbag_loader __name:=rovio _log:=/home/jaeger/.ros/log/0eb47d88-6864-11e6-9f29-000c29239801/rovio-1.log].
log file: /home/jaeger/.ros/log/0eb47d88-6864-11e6-9f29-000c29239801/rovio-1
.log
all processes on machine have died, roslaunch will exit
shutting down processing monitor...
... shutting down processing monitor complete
done
`
and no log files under "/home/jaeger/.ros/log/_" contain any more detail.

I also tested the rosbags and they are not corrupted.

What can I do? What did I do wrong? What did I miss?

build error (Eigen missing)

Hi, all,

I got the following build error. It seems Eigen is missing on my system (ROS Indigo). Could someone suggest how to fix it? Thanks.

root@milton-ThinkPad-T450:~/catkin_ws# catkin build rovio --cmake-args -DCMAKE_BUILD_TYPE=Release
-------------------------------------------------------
Profile:                     default
Extending:          [cached] /opt/ros/indigo
Workspace:                   /root/catkin_ws
-------------------------------------------------------
Source Space:       [exists] /root/catkin_ws/src
Log Space:          [exists] /root/catkin_ws/logs
Build Space:        [exists] /root/catkin_ws/build
Devel Space:        [exists] /root/catkin_ws/devel
Install Space:      [unused] /root/catkin_ws/install
DESTDIR:            [unused] None
-------------------------------------------------------
Devel Space Layout:          linked
Install Space Layout:        None
-------------------------------------------------------
Additional CMake Args:       -DCMAKE_BUILD_TYPE=Release
Additional Make Args:        None
Additional catkin Make Args: None
Internal Make Job Server:    True
Cache Job Environments:      False
-------------------------------------------------------
Whitelisted Packages:        None
Blacklisted Packages:        None
-------------------------------------------------------
Workspace configuration appears valid.

NOTE: Forcing CMake to run for each package.
-------------------------------------------------------
[build] Found '4' packages in 0.0 seconds.                                     
[build] Package table is up to date.                                           
Warning: devel space setup files have an unknown origin.
Starting  >>> rovio                                                            
_______________________________________________________________________________
Errors     << rovio:cmake /root/catkin_ws/logs/rovio/build.cmake.006.log       
CMake Error at /root/catkin_ws/src/rovio/lightweight_filtering/CMakeLists.txt:15 (find_package):
  By not providing "FindEigen3.cmake" in CMAKE_MODULE_PATH this project has
  asked CMake to find a package configuration file provided by "Eigen3", but
  CMake did not find one.

  Could not find a package configuration file provided by "Eigen3" with any
  of the following names:

    Eigen3Config.cmake
    eigen3-config.cmake

  Add the installation prefix of "Eigen3" to CMAKE_PREFIX_PATH or set
  "Eigen3_DIR" to a directory containing one of the above files.  If "Eigen3"
  provides a separate development package or SDK, be sure it has been
  installed.


cd /root/catkin_ws/build/rovio; catkin build --get-env rovio | catkin env -si  /usr/local/bin/cmake /root/catkin_ws/src/rovio --no-warn-unused-cli -DCATKIN_DEVEL_PREFIX=/root/catkin_ws/devel/.private/rovio -DCMAKE_INSTALL_PREFIX=/root/catkin_ws/install -DCMAKE_BUILD_TYPE=Release; cd -
...............................................................................
Failed     << rovio:cmake          [ Exited with code 1 ]                      
Failed    <<< rovio                [ 0.0 seconds ]                             
[build] Summary: 0 of 1 packages succeeded.                                    
[build]   Ignored:   3 packages were skipped or are blacklisted.               
[build]   Warnings:  None.                                                     
[build]   Abandoned: None.                                                     
[build]   Failed:    1 packages failed.                                        
[build] Runtime: 0.2 seconds total.                                            
root@milton-ThinkPad-T450:~/catkin_ws# 

The various coordinate frames used in ROVIO

It'd be great if we had a description of the various coordinate frames used in ROVIO and the acronyms used to refer to them in the code.

We have the following:

IMU CF:
Located at the IMU origin, with axes aligned with the IMU's internal accelerometers.
Referred to with the letter M in the rovio.info file.

Camera CF:
Located at the camera's principal point, with x and y axes along the image plane and z going out of the image plane.
Referred to with the letter C in the rovio.info file.

Reference Body CF:
Referred to with the letter V in the rovio.info file.

Reference Inertial CF:
Referred to with the letter I in the rovio.info file.

World CF:
Referred to with the letter W in the rovio.info file.

I created a rough diagram from what I got in the discussion with @raghavkhanna and @bloesch earlier:

[diagram: rovio cf - new page 1]

Correct me where I am wrong.

Performance with 10-20m scene depth

Hi Michael,
Testing rovio while hand-holding a custom-made synchronized IMU-camera sensor has been great so far. However, flying at higher altitudes (10-20 m) with common automated waypoint-based UAV patterns has shown some divergence problems, as mentioned in the paper:

Furthermore, we also observed a divergence mode for the presented approach. It can occur when the velocity estimate diverge, e.g., due to missing motion or too many outliers. The problem is then, that the filter attempts to minimize the effect of the erroneous velocity on the bearing vectors by setting the distance of the features to infinity. This again lowers any corrective effect on the diverging velocity resulting in further divergence.

In these situations the feature tracking seemed to work pretty well; it is usually the depth propagation that becomes bad over time. Neither the zero-velocity updates nor some tricks I added (like using the barometer altitude or climb speed) have made much impact.

I was wondering if you had done experiments in similar situations or had switched to a stereo setup as stated in the conclusion of the paper.

Building with OpenCV3

I'm trying to build Rovio with OpenCV 3, which I installed from source. I fixed a couple of the smaller issues, which gets me to this stage:

[ 14%] [ 28%] Building CXX object CMakeFiles/rovio.dir/src/rovio_node.cpp.o
Building CXX object CMakeFiles/rovio.dir/src/FeatureCoordinates.cpp.o
In file included from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/MultilevelPatch.hpp:34:0,
                 from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/FeatureManager.hpp:35,
                 from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/FilterStates.hpp:38,
                 from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/RovioFilter.hpp:34,
                 from /home/artemis/artemis_ws/src/vision/rovio/src/rovio_node.cpp:31:
/home/artemis/artemis_ws/src/vision/rovio/include/rovio/ImagePyramid.hpp: In instantiation of void rovio::ImagePyramid<n_levels>::detectFastCorners(std::vector<rovio::FeatureCoordinates>&, int, int) const [with int n_levels = 4]:
/home/artemis/artemis_ws/src/vision/rovio/include/rovio/ImgUpdate.hpp:906:11:   required from void rovio::ImgUpdate<FILTERSTATE>::commonPostProcess(rovio::ImgUpdate<FILTERSTATE>::mtFilterState&, const mtMeas&) [with FILTERSTATE = rovio::FilterState<25u, 4, 8, 2, 0>; rovio::ImgUpdate<FILTERSTATE>::mtFilterState = rovio::FilterState<25u, 4, 8, 2, 0>; rovio::ImgUpdate<FILTERSTATE>::mtMeas = rovio::ImgUpdateMeas<rovio::State<25u, 4, 8, 2, 0> >]
/home/artemis/artemis_ws/src/vision/rovio/include/rovio/ImgUpdate.hpp:706:41:   required from void rovio::ImgUpdate<FILTERSTATE>::postProcess(rovio::ImgUpdate<FILTERSTATE>::mtFilterState&, const mtMeas&, const mtOutlierDetection&, bool&) [with FILTERSTATE = rovio::FilterState<25u, 4, 8, 2, 0>; rovio::ImgUpdate<FILTERSTATE>::mtFilterState = rovio::FilterState<25u, 4, 8, 2, 0>; rovio::ImgUpdate<FILTERSTATE>::mtMeas = rovio::ImgUpdateMeas<rovio::State<25u, 4, 8, 2, 0> >; rovio::ImgUpdate<FILTERSTATE>::mtOutlierDetection = rovio::ImgOutlierDetection<rovio::State<25u, 4, 8, 2, 0> >]
/home/artemis/artemis_ws/src/vision/rovio/src/rovio_node.cpp:118:1:   required from here
/home/artemis/artemis_ws/src/vision/rovio/include/rovio/ImagePyramid.hpp:137:75: error: no matching function for call to cv::FastFeatureDetector::FastFeatureDetector(int&, bool)
     cv::FastFeatureDetector feature_detector_fast(detectionThreshold, true);
                                                                           ^
/home/artemis/artemis_ws/src/vision/rovio/include/rovio/ImagePyramid.hpp:137:75: note: candidates are:
In file included from /usr/local/include/opencv2/features2d/features2d.hpp:48:0,
                 from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/FeatureCoordinates.hpp:34,
                 from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/RobocentricFeatureElement.hpp:33,
                 from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/CoordinateTransform/FeatureOutput.hpp:34,
                 from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/FilterStates.hpp:36,
                 from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/RovioFilter.hpp:34,
                 from /home/artemis/artemis_ws/src/vision/rovio/src/rovio_node.cpp:31:
/usr/local/include/opencv2/features2d.hpp:411:20: note: cv::FastFeatureDetector::FastFeatureDetector()
 class CV_EXPORTS_W FastFeatureDetector : public Feature2D
                    ^
/usr/local/include/opencv2/features2d.hpp:411:20: note:   candidate expects 0 arguments, 2 provided
/usr/local/include/opencv2/features2d.hpp:411:20: note: cv::FastFeatureDetector::FastFeatureDetector(const cv::FastFeatureDetector&)
/usr/local/include/opencv2/features2d.hpp:411:20: note:   candidate expects 1 argument, 2 provided
/usr/local/include/opencv2/features2d.hpp:411:20: note: cv::FastFeatureDetector::FastFeatureDetector(cv::FastFeatureDetector&&)
/usr/local/include/opencv2/features2d.hpp:411:20: note:   candidate expects 1 argument, 2 provided
In file included from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/MultilevelPatch.hpp:34:0,
                 from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/FeatureManager.hpp:35,
                 from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/FilterStates.hpp:38,
                 from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/RovioFilter.hpp:34,
                 from /home/artemis/artemis_ws/src/vision/rovio/src/rovio_node.cpp:31:
/home/artemis/artemis_ws/src/vision/rovio/include/rovio/ImagePyramid.hpp:137:29: error: cannot declare variable feature_detector_fast to be of abstract type cv::FastFeatureDetector
     cv::FastFeatureDetector feature_detector_fast(detectionThreshold, true);
                             ^
In file included from /usr/local/include/opencv2/features2d/features2d.hpp:48:0,
                 from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/FeatureCoordinates.hpp:34,
                 from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/RobocentricFeatureElement.hpp:33,
                 from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/CoordinateTransform/FeatureOutput.hpp:34,
                 from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/FilterStates.hpp:36,
                 from /home/artemis/artemis_ws/src/vision/rovio/include/rovio/RovioFilter.hpp:34,
                 from /home/artemis/artemis_ws/src/vision/rovio/src/rovio_node.cpp:31:
/usr/local/include/opencv2/features2d.hpp:411:20: note:   because the following virtual functions are pure within cv::FastFeatureDetector:
 class CV_EXPORTS_W FastFeatureDetector : public Feature2D
                    ^
/usr/local/include/opencv2/features2d.hpp:424:26: note:     virtual void cv::FastFeatureDetector::setThreshold(int)
     CV_WRAP virtual void setThreshold(int threshold) = 0;
                          ^
/usr/local/include/opencv2/features2d.hpp:425:25: note:     virtual int cv::FastFeatureDetector::getThreshold() const
     CV_WRAP virtual int getThreshold() const = 0;
                         ^
/usr/local/include/opencv2/features2d.hpp:427:26: note:     virtual void cv::FastFeatureDetector::setNonmaxSuppression(bool)
     CV_WRAP virtual void setNonmaxSuppression(bool f) = 0;
                          ^
/usr/local/include/opencv2/features2d.hpp:428:26: note:     virtual bool cv::FastFeatureDetector::getNonmaxSuppression() const
     CV_WRAP virtual bool getNonmaxSuppression() const = 0;
                          ^
/usr/local/include/opencv2/features2d.hpp:430:26: note:     virtual void cv::FastFeatureDetector::setType(int)
     CV_WRAP virtual void setType(int type) = 0;
                          ^
/usr/local/include/opencv2/features2d.hpp:431:25: note:     virtual int cv::FastFeatureDetector::getType() const
     CV_WRAP virtual int getType() const = 0;

Any help with getting it running would be great! Thanks!
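
The failing line constructs cv::FastFeatureDetector directly, which was possible in OpenCV 2 but not in OpenCV 3, where (as the errors above show) the class became abstract and is obtained through its static create() factory. A sketch of the two variants side by side; the version switch via CV_MAJOR_VERSION is illustrative, not how rovio itself is structured:

```cpp
// Sketch of the OpenCV 2 vs. OpenCV 3 FAST detector construction behind the
// error above. In OpenCV 3, cv::FastFeatureDetector is abstract and must be
// created via its static create() factory.
#include <opencv2/core/core.hpp>
#include <opencv2/core/version.hpp>
#include <opencv2/features2d/features2d.hpp>
#include <vector>

void detectFastCorners(const cv::Mat& image, int detectionThreshold,
                       std::vector<cv::KeyPoint>& keypoints) {
#if CV_MAJOR_VERSION < 3
  // OpenCV 2 style (what ImagePyramid.hpp currently does).
  cv::FastFeatureDetector feature_detector_fast(detectionThreshold, true);
  feature_detector_fast.detect(image, keypoints);
#else
  // OpenCV 3 style: factory returning a cv::Ptr to the concrete detector.
  cv::Ptr<cv::FastFeatureDetector> feature_detector_fast =
      cv::FastFeatureDetector::create(detectionThreshold, true);
  feature_detector_fast->detect(image, keypoints);
#endif
}
```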

Workability on KITTI

Hello everyone.
I was interested to find out how rovio would do on some universally known dataset. I tried to run it on a sequence from KITTI, yet got no sensible output there.
To sum up my endeavours:

  • I followed the wiki's configuration suggestions.

  • I also implemented the suggestions from issue #62, namely trying to check out the January revision suggested there (with no improvement).

  • My setup works exactly as shown in the demonstration video on the authors' Machine Hall dataset.

  • I tended to blame the camera-IMU calibration, so I even tried to pass the exact calibration provided by KITTI.

    I'm attaching the KITTI sequence in the form of a rosbag, and the parameters I used. The attached screenshots show the most workable version I managed to get: the third and fourth frames of the dataset, around the moment when the valid keypoints disappear, never to appear again.
    Thank you for your suggestions!

[images: kitti_rovio_3rd, kitti_rovio_4th]

Launch file:

<?xml version="1.0" encoding="UTF-8"?> 
<launch>
  <node pkg="rovio" type="rovio_rosbag_loader" name="rovio_bag_reader_kitti" output="screen">
  <param name="filter_config" value="$(find rovio)/cfg/rovio_kitti.info"/>
  <param name="camera0_config" value="$(find rovio)/cfg/euroc_cam0_kitti.yaml"/>
  <param name="rosbag_filename" value="/home/XXX/datasets/rovio_bags/kitti_longer.bag"/>
  <param name="imu_topic_name" value="/imu0"/>
  <param name="cam0_topic_name" value="/cam0/image_raw"/>
  </node>
</launch>

Camera Calibration:

image_width: 1242
image_height: 375
camera_name: cam0
camera_matrix:
  rows: 3
  cols: 3
  data: [7.215377e+02, 0.000000e+00, 6.095593e+02, 0.000000e+00, 7.215377e+02, 1.728540e+02, 0.000000e+00, 0.000000e+00, 1.000000e+00]
distortion_model: plumb_bob
distortion_coefficients:
  rows: 1
  cols: 5
  data:  [0.0,  0.0, 0.0, 0.0, 0.0]

Rovio configuration:

Common
{
    doVECalibration false;      Should the camera-IMU extrinsics be calibrated online
    depthType 1;                Type of depth parametrization (0: normal, 1: inverse depth, 2: log, 3: hyperbolic)
    verbose false;              Is the verbose active
}
Camera0
{
     qCM_x 0.5010303840;
     qCM_y -0.4989766415;
     qCM_z 0.4982646186;
     qCM_w 0.5017202760;
     MrMC_x 0.8127456660;
     MrMC_y -0.2432397200;
     MrMC_z 1.0715037000;
}
Init
{
    State
    {
        pos_0 0;            X-entry of initial position (world to IMU in world) [m]
        pos_1 0;            Y-entry of initial position (world to IMU in world) [m]
        pos_2 0;            Z-entry of initial position (world to IMU in world) [m]
        vel_0 0;            X-entry of initial velocity (robocentric, IMU) [m/s]
        vel_1 0;            Y-entry of initial velocity (robocentric, IMU) [m/s]
        vel_2 0;            Z-entry of initial velocity (robocentric, IMU) [m/s]
        acb_0 0;            X-entry of accelerometer bias [m/s^2]
        acb_1 0;            Y-entry of accelerometer bias [m/s^2]
        acb_2 0;            Z-entry of accelerometer bias [m/s^2]
        gyb_0 0;            X-entry of gyroscope bias [rad/s]
        gyb_1 0;            Y-entry of gyroscope bias [rad/s]
        gyb_2 0;            Z-entry of gyroscope bias [rad/s]
        att_x 0;            X-entry of initial attitude (IMU to world, JPL)
        att_y 0;            Y-entry of initial attitude (IMU to world, JPL)
        att_z 0;            Z-entry of initial attitude (IMU to world, JPL)
        att_w 1;            W-entry of initial attitude (IMU to world, JPL)
    }
    Covariance
    {
        pos_0 0.0001;       X-Covariance of initial position [m^2]
        pos_1 0.0001;       Y-Covariance of initial position [m^2]
        pos_2 0.0001;       Z-Covariance of initial position [m^2]
        vel_0 1.0;          X-Covariance of initial velocity [m^2/s^2]
        vel_1 1.0;          Y-Covariance of initial velocity [m^2/s^2]
        vel_2 1.0;          Z-Covariance of initial velocity [m^2/s^2]
        acb_0 4e-4;         X-Covariance of initial accelerometer bias [m^2/s^4]
        acb_1 4e-4;         Y-Covariance of initial accelerometer bias [m^2/s^4]
        acb_2 4e-4;         Z-Covariance of initial accelerometer bias [m^2/s^4]
        gyb_0 3e-4;         X-Covariance of initial gyroscope bias [rad^2/s^2]
        gyb_1 3e-4;         Y-Covariance of initial gyroscope bias [rad^2/s^2]
        gyb_2 3e-4;         Z-Covariance of initial gyroscope bias [rad^2/s^2]
        vep 0.0001;         Covariance of initial linear camera-IMU extrinsics, same for all entries [m^2]
        att_0 0.1;          X-Covariance of initial attitude [rad^2]
        att_1 0.1;          Y-Covariance of initial attitude [rad^2]
        att_2 0.1;          Z-Covariance of initial attitude [rad^2]
        vea 0.01;           Covariance of initial rotational camera-IMU extrinsics, same for all entries [rad^2]
    }
}
ImgUpdate
{
    updateVecNormTermination 1e-4;
    maxNumIteration 20;
    doPatchWarping true;                                        Should the patches be warped
    doFrameVisualisation true;                                      Should the frames be visualized
    visualizePatches false;                                     Should the patches be visualized
    useDirectMethod true;                                       Should the EKF-innovation be based on direct intensity error (o.w. reprojection error)
    startLevel 2;                                               Highest patch level which is being employed (must be smaller than the hardcoded template parameter)
    endLevel 1;                                                 Lowest patch level which is being employed
    nDetectionBuckets 100;                                      Number of discretization buckets used during the candidates selection
    MahalanobisTh 9.21;                                         Mahalanobis threshold for the update, 5.8858356 9.21
    UpdateNoise
    {
        pix 2;                                                  Covariance used for the reprojection error, 1/focal_length is roughly 1 pixel std [rad] (~1/f ~ 1/400^2 = 1/160000)
        int 400;                                                Covariance used for the photometric error [intensity^2]
    }
    initCovFeature_0 4.0;                                       Covariance for the initial distance, relative to the initialization depth [m^2/m^2] (alternative value: 0.5)
    initCovFeature_1 1e-5;                                      Covariance for the initial bearing vector, x-component [rad^2]
    initCovFeature_2 1e-5;                                      Covariance for the initial bearing vector, y-component [rad^2]
    initDepth 0.5;                                              Initial value for the initial distance parameter
    startDetectionTh 0.8;                                       Threshold for detecting new features (min: 0, max: 1)
    scoreDetectionExponent 0.25;                                Exponent used for weighting the distance between candidates
    penaltyDistance 50;                                         Maximal distance which gets penalized during bucketing [pix] (alternative value: 100)
    zeroDistancePenalty 100;                                    Penalty for zero distance (max: nDetectionBuckets)
    statLocalQualityRange 10;                                   Number of frames for local quality evaluation
    statLocalVisibilityRange 100;                               Number of frames for local visibility evaluation
    statMinGlobalQualityRange 100;                              Minimum number of frames for obtaining maximal global quality
    trackingUpperBound 0.9;                                     Threshold for local quality for min overall global quality
    trackingLowerBound 0.8;                                     Threshold for local quality for max overall global quality
    minTrackedAndFreeFeatures 0.75;                             Minimum fraction of features which are either tracked or free
    removalFactor 1.1;                                          Factor for enforcing feature removal if not enough features are free
    minRelativeSTScore 0.75;                                    Minimum relative ST-score for extracting new feature patch
    minAbsoluteSTScore 5.0;                                     Minimum absolute ST-score for extracting new feature patch
    minTimeBetweenPatchUpdate 1.0;                              Minimum time between new multilevel patch extractions [s]
    fastDetectionThreshold 5;                                   FAST corner detector threshold when adding new features
    alignConvergencePixelRange 10;                              Assumed convergence range for image alignment (gets scaled with the level) [pixels]
    alignCoverageRatio 2;                                       How many sigma of the uncertainty should be covered in the adaptive alignment
    alignMaxUniSample 1;                                        Maximal number of alignment seeds on one side -> total number of samples = 2n+1. Careful: can get very costly if diverging!
    patchRejectionTh 50.0;                                      If the average intensity error is larger than this, the feature is rejected [intensity]; if smaller than 0, no check is performed
    alignmentHuberNormThreshold 10;                             Intensity error threshold for Huber norm (enabled if > 0)
    alignmentGaussianWeightingSigma -1;                         Width of Gaussian which is used for pixel error weighting (enabled if > 0)
    alignmentGradientExponent 0.0;                              Exponent used for gradient based weighting of residuals
    useIntensityOffsetForAlignment true;                        Should an intensity offset between the patches be considered
    useIntensitySqewForAlignment true;                          Should an intensity skew between the patches be considered
    removeNegativeFeatureAfterUpdate true;                      Should features with negative distance get removed
    maxUncertaintyToDepthRatioForDepthInitialization 0.3;       If set to 0.0 the depth is initialized with the standard value provided above, otherwise ROVIO attempts to estimate a median depth in each frame
    useCrossCameraMeasurements true;                            Should cross measurements between frames be used. Might be turned off during the calibration phase.
    doStereoInitialization true;                                Should a stereo match be used for feature initialization.
    noiseGainForOffCamera 10.0;                                 Factor applied to the update noise if not the main camera
    discriminativeSamplingDistance 0.02;                        Sampling distance for checking discriminativity of patch (if <= 0.0 no check is performed).
    discriminativeSamplingGain 1.1;                             Gain for threshold above which the samples must lie (if <= 1.0 the patchRejectionTh is used).
    MotionDetection
    {
        isEnabled 0;                                            Is the motion detection enabled
        rateOfMovingFeaturesTh 0.5;                             Fraction of features with motion required for overall motion detection
        pixelCoordinateMotionTh 1.0;                            Threshold for motion detection of patches [pixels]
        minFeatureCountForNoMotionDetection 5;                  Minimum feature count in the frame required for no-motion detection
    }
    ZeroVelocityUpdate
    {
        UpdateNoise
        {
            vel_0 0.01;                                         X-Covariance of zero velocity update [m^2/s^2]
            vel_1 0.01;                                         Y-Covariance of zero velocity update [m^2/s^2]
            vel_2 0.01;                                         Z-Covariance of zero velocity update [m^2/s^2]
        }
        MahalanobisTh0 7.689997599999999;                       Mahalanobis distance threshold for zero velocity updates
        minNoMotionTime 1.0;                                    Minimum duration of no motion [s]
        isEnabled 0;                                            Should zero velocity update be applied, only works if MotionDetection.isEnabled is true
    }
}
Prediction
{
    PredictionNoise
    {
        pos_0 1e-4;                             X-covariance parameter of position prediction [m^2/s]
        pos_1 1e-4;                             Y-covariance parameter of position prediction [m^2/s]
        pos_2 1e-4;                             Z-covariance parameter of position prediction [m^2/s]
        vel_0 4e-5;                             X-covariance parameter of velocity prediction [m^2/s^3] (alternative value: 4e-6)
        vel_1 4e-5;                             Y-covariance parameter of velocity prediction [m^2/s^3]
        vel_2 4e-5;                             Z-covariance parameter of velocity prediction [m^2/s^3]
        acb_0 1e-8;                             X-covariance parameter of accelerometer bias prediction [m^2/s^5]
        acb_1 1e-8;                             Y-covariance parameter of accelerometer bias prediction [m^2/s^5]
        acb_2 1e-8;                             Z-covariance parameter of accelerometer bias prediction [m^2/s^5]
        gyb_0 3.8e-7;                           X-covariance parameter of gyroscope bias prediction [rad^2/s^3]
        gyb_1 3.8e-7;                           Y-covariance parameter of gyroscope bias prediction [rad^2/s^3]
        gyb_2 3.8e-7;                           Z-covariance parameter of gyroscope bias prediction [rad^2/s^3]
        vep 1e-8;                               Covariance parameter of linear extrinsics prediction [m^2/s]
        att_0 7.6e-7;                           X-Covariance parameter of attitude prediction [rad^2/s]
        att_1 7.6e-7;                           Y-Covariance parameter of attitude prediction [rad^2/s]
        att_2 7.6e-7;                           Z-Covariance parameter of attitude prediction [rad^2/s]
        vea 1e-8;                               Covariance parameter of rotational extrinsics prediction [rad^2/s]
        dep 0.0001;                             Covariance parameter of distance prediction [m^2/s]
        nor 0.00001;                            Covariance parameter of bearing vector prediction [rad^2/s]
    }
    MotionDetection
    {
        inertialMotionRorTh 0.1;                Threshold on rotational rate for motion detection [rad/s]
        inertialMotionAccTh 0.5;                Threshold on acceleration for motion detection [m/s^2]
    }
}
PoseUpdate
{
    UpdateNoise
    {
        pos_0 0.01;                         X-Covariance of linear pose measurement update [m^2]
        pos_1 0.01;                         Y-Covariance of linear pose measurement update [m^2]
        pos_2 0.01;                         Z-Covariance of linear pose measurement update [m^2]
        att_0 0.01;                         X-Covariance of rotational pose measurement update [rad^2]
        att_1 0.01;                         Y-Covariance of rotational pose measurement update [rad^2]
        att_2 0.01;                         Z-Covariance of rotational pose measurement update [rad^2]
    }
    init_cov_IrIW 1;                            Covariance of initial pose between inertial frames, linear part [m^2]
    init_cov_qWI 1;                             Covariance of initial pose between inertial frames, rotational part [rad^2]
    init_cov_MrMV 1;                            Covariance of initial calibration between IMU and body frame of the pose measurement, linear part [m^2]
    init_cov_qVM 1;                             Covariance of initial calibration between IMU and body frame of the pose measurement, rotational part [rad^2]
    pre_cov_IrIW 1e-4;                          Covariance parameter of pose between inertial frames, linear part [m^2/s]
    pre_cov_qWI 1e-4;                           Covariance parameter of pose between inertial frames, rotational part [rad^2/s]
    pre_cov_MrMV 1e-4;                          Covariance parameter of calibration between IMU and body frame of the pose measurement, linear part [m^2/s]
    pre_cov_qVM 1e-4;                           Covariance parameter of calibration between IMU and body frame of the pose measurement, rotational part [rad^2/s]
    MahalanobisTh0 12.6511204;                  Mahalanobis distance threshold of the pose measurement
    doVisualization false;                      Should the measured poses be visualized
    enablePosition true;                        Should the linear part be used during the update
    enableAttitude true;                        Should the rotational part be used during the update (e.g. set to false if feeding GPS measurements)
    noFeedbackToRovio true;                     By default the pose update is only used for aligning the coordinate frames. Set to false if ROVIO should benefit from the pose measurements.
    doInertialAlignmentAtStart true;            Should the transformation between I and W be explicitly computed and set with the first pose measurement.
    timeOffset 0.0;                             Time offset added to the pose measurement timestamps
    qVM_x 0;                                    X-entry of quaternion representing IMU to reference body coordinate frame of pose measurement (JPL)
    qVM_y 0;                                    Y-entry of quaternion representing IMU to reference body coordinate frame of pose measurement (JPL)
    qVM_z 0;                                    Z-entry of quaternion representing IMU to reference body coordinate frame of pose measurement (JPL)
    qVM_w 1;                                    W-entry of quaternion representing IMU to reference body coordinate frame of pose measurement (JPL)
    MrMV_x 0;                                   X-entry of vector representing IMU to reference body coordinate frame of pose measurement
    MrMV_y 0;                                   Y-entry of vector representing IMU to reference body coordinate frame of pose measurement
    MrMV_z 0;                                   Z-entry of vector representing IMU to reference body coordinate frame of pose measurement
    qWI_x 0;                                    X-entry of quaternion representing World to reference inertial coordinate frame of pose measurement (JPL)
    qWI_y 0;                                    Y-entry of quaternion representing World to reference inertial coordinate frame of pose measurement (JPL)
    qWI_z 0;                                    Z-entry of quaternion representing World to reference inertial coordinate frame of pose measurement (JPL)
    qWI_w 1;                                    W-entry of quaternion representing World to reference inertial coordinate frame of pose measurement (JPL)
    IrIW_x 0;                                   X-entry of vector representing World to reference inertial coordinate frame of pose measurement
    IrIW_y 0;                                   Y-entry of vector representing World to reference inertial coordinate frame of pose measurement
    IrIW_z 0;                                   Z-entry of vector representing World to reference inertial coordinate frame of pose measurement
}
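
A note on the PredictionNoise units, as an assumption rather than a statement about the exact implementation: the per-second units in the comments (e.g. [m^2/s], [m^2/s^3]) suggest continuous-time noise densities, so the covariance contributed by one prediction step of length \Delta t would be roughly the density scaled by the time step:

Q_k \approx Q_c \cdot \Delta t
e.g. for the position entry: 1e-4 m^2/s * 0.005 s = 5e-7 m^2 per prediction step at a 200 Hz IMU rate.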

medianDepth for new features initialization

In the image update, new features are initialized as follows:

// Get new features
if(filterState.fsm_.getValidCount() < startDetectionTh_*mtState::nMax_){
  // Compute the median depth parameters for each camera, using the state features.
  std::array<double, mtState::nCam_> medianDepthParameters;
  if(maxUncertaintyToDepthRatioForDepthInitialization_>0){
    filterState.getMedianDepthParameters(initDepth_, &medianDepthParameters, maxUncertaintyToDepthRatioForDepthInitialization_);
  } else {
    medianDepthParameters.fill(initDepth_);
  }

Two things to get a better initialization estimate when a new frame loses too many features:

  • Compute medianDepthParameters for every frame rather than just for the frames where the number of tracked features is low, since frames with many tracked features give a better initialization estimate.
  • Assume some smoothness in the motion and use the previous frame's median for the current frame's depth initialization.

This addresses the vicious cycle where most features are suddenly lost in the current frame, so its median depth for initialization is no longer good enough; that leads to bad depth initialization of all new features, which leads to tracking failure for most of them in the subsequent imgUpdate, and so on until the filter diverges. This happened, for example, when I was about to land close to the ground. I tested this strategy (sketched below) and it works better than the one we currently have.
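
A minimal sketch of the proposed change, assuming hypothetical members (prevMedianDepthParameters_, hasPrevMedian_, minValidCountForMedian_) that are not part of the current ROVIO code; it would replace the median computation shown above:

// Hypothetical sketch: refresh the per-camera median depth every frame and fall back to
// the previous frame's value when the current frame has too few valid features.
std::array<double, mtState::nCam_> medianDepthParameters;
const bool enoughFeaturesForMedian = filterState.fsm_.getValidCount() >= minValidCountForMedian_;
if(maxUncertaintyToDepthRatioForDepthInitialization_ > 0 && enoughFeaturesForMedian){
  filterState.getMedianDepthParameters(initDepth_, &medianDepthParameters, maxUncertaintyToDepthRatioForDepthInitialization_);
  prevMedianDepthParameters_ = medianDepthParameters;   // cache for the next frame
  hasPrevMedian_ = true;
} else if(hasPrevMedian_){
  medianDepthParameters = prevMedianDepthParameters_;   // smoothness assumption: reuse the last median
} else {
  medianDepthParameters.fill(initDepth_);               // fall back to the configured initDepth
}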

Are any datasets available?

Hi,

I was wondering if there are any bag files available, or if any will be made available eventually, so that I can try this out and get a feel for how it works.

Thanks

camera drift away with low level IMU module

We want to run ROVIO with our own IMU (MPU6050, 200 Hz) and camera (Logitech C525, 30 Hz) instead of the provided dataset. We followed the instructions in https://github.com/ethz-asl/rovio/wiki/Configuration and, in particular, changed the IMU-camera configuration as shown in the figure below.
[figure: IMU-camera configuration screenshot]

For testing we fixed our device on a table and made sure it did not move at all, but the estimate drifted away unexpectedly. We uploaded the result and you can check it at this link: https://youtu.be/mDRDnI2Tam4.

Could you give us some advice on what we should do to fix this?

Looking forward to your reply!

Thank you very much.

references for the EKF implementation notations

Hi, does someone have references for the lightweight filter implementation of the EKF, e.g. what F and G mean in the prediction procedure? I need the reference formulas to understand the code.
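
Not an authoritative answer, but in standard EKF notation (which, as far as I can tell, matches the convention used here) F and G are the Jacobians of the discrete-time process model with respect to the state and the process noise:

x_k = f(x_{k-1}, u_k, w_k),    w_k ~ N(0, Q_k)
F_k = \partial f / \partial x   evaluated at (x_{k-1}, u_k, 0)
G_k = \partial f / \partial w   evaluated at (x_{k-1}, u_k, 0)
P_k^- = F_k P_{k-1} F_k^T + G_k Q_k G_k^T

So F propagates the state covariance and G maps the process-noise covariance into the state space during the prediction step.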

Response to gazebo image stream

I am trying to run the ROVIO pipeline on the IMU + vision stream coming from the gazebo iris model's vi-sensor in rotors_simulator (to eventually use the pose update from ROVIO in the feedback loop with Pixhawk SITL).

The estimator seems to diverge within a few seconds, even for simple hovering datasets. I have updated the .yaml files and the rovio.info file to use the calibration of the gazebo camera and to set the camera-IMU transformation (it is the identity). The image scenery is fairly textured (though planar), and the detector finds and tracks features for a while before the depth uncertainty blows up.

I have ROVIO running decently on other "real" vi-sensor datasets, and the one difference I see with the image stream coming out of gazebo is that the framerate varies a lot (see ethz-asl/rotors_simulator#277). Could this cause the estimator to blow up?

GPU (CUDA) acceleration for Rovio

@bloesch I was looking into speeding up Rovio on Nvidia GPUs using CUDA.

Is there a timing profile for Rovio, or would you be able to advise which sections of Rovio would be most beneficial to parallelize on the GPU to free up the CPU (especially in the stereo case)?
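
In case it helps while an official profile is unavailable, a generic way to obtain rough per-section timings is to wrap the calls of interest with std::chrono timers; a minimal sketch, where runImageUpdate() is just a placeholder for whatever section is being measured:

#include <chrono>
#include <iostream>

int main() {
  using Clock = std::chrono::steady_clock;
  double totalMs = 0.0;
  int frames = 0;
  for (int i = 0; i < 100; ++i) {                        // e.g. loop over frames
    const auto t0 = Clock::now();
    // runImageUpdate();                                 // placeholder for the measured section
    const auto t1 = Clock::now();
    totalMs += std::chrono::duration<double, std::milli>(t1 - t0).count();
    ++frames;
  }
  std::cout << "mean section time: " << totalMs / frames << " ms\n";
  return 0;
}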

Camera calibration problems?

Hi,
I am evaluating ROVIO for our mining project. I have spent more than two weeks on it and cannot make it work with my camera. I am an embedded software engineer but quite new to ROS and machine vision.
In short, I can play one of the Euroc test datasets and ROVIO seems to be happy with it (YouTube: Euroc), but when I run it on my vehicle the odometry that comes back just drifts off in a particular direction (tens of meters per minute).
I recorded a bag and played it on my laptop. I think this may be the biggest clue, but I do not know what I am looking at (YouTube: lab set).
My intuition tells me it is either the camera calibration or the IMU:

  1. Camera calibration: I tried Kalibr and http://wiki.ros.org/camera_calibration; the videos are with the latter. Kalibr only outputs a radial-tangential model, and I converted it to the plumb_bob model supported by ROVIO by adding 0.0 as the last parameter (plumb_bob requires 5 coefficients while Kalibr's radial-tangential gives only 4). Is that a correct assumption? (A calibration-file sketch follows after this list.)

  2. A test fails during startup. In both cases, Euroc:
    rovio_eurac.txt
    and my lab set:
    rovio_lab_set.txt
    there is a failed test at startup:
    Testing pixelOutputCT (can sometimes exhibit large absolut errors due to the float precision)
    ==== Model jacInput Test failed: 55.5229 is larger than 1 at row 1(def_0) and col 0() ====
    72399.3 72343.8
    What does that signify?

  3. IMU: I am using a Pixhawk with the PX4 flight stack and mavros to get the IMU data to the Odroid computer where ROVIO runs. When running the dynamic calibration, the Kalibr tool reports a ~15 ms delay in the IMU data:
    result-cam_dynamic.txt
    Is that usable, or do I need another IMU? It would be nice to get my hands on a VI-Sensor, but I do not see it available anywhere. It looks like GoPro snatched up the company?
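
Not an official answer, but for reference, a camera calibration file in the standard ROS calibration yaml layout with the distortion vector padded to 5 entries could look like the sketch below (all numbers are made-up placeholders; the Euroc example files shipped with ROVIO follow this layout, if I recall correctly):

image_width: 752
image_height: 480
camera_name: cam0
camera_matrix:
  rows: 3
  cols: 3
  data: [460.0, 0.0, 376.0, 0.0, 460.0, 240.0, 0.0, 0.0, 1.0]
distortion_model: plumb_bob
distortion_coefficients:
  rows: 1
  cols: 5
  data: [-0.28, 0.07, 0.0002, 0.00002, 0.0]    # k1, k2, p1, p2, and the padded fifth coefficient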

Going through other issues I noticed that the rotation and translation between the IMU and the camera are important. I basically took the original rovio.info file and replaced the quaternions and translations with the output of the Kalibr dynamic calibration. I also calculated them manually and the two agree well, so I am confident in this part.

If this helps, here are the launch files:
miner_rovio.zip

Can you please point me in the right direction?
P.S. If I get this working we may use it in a commercial project. I assume your license statement is current?
