
livox_camera_lidar_calibration's Introduction

Chinese documentation (中文文档)

Camera-LiDAR-Calibration Manual

This solution provides a method for manually calibrating the extrinsic parameters between a Livox LiDAR and a camera; it has been verified on the Mid-40, Horizon, and Tele-15 series. It covers the calibration of the camera intrinsic parameters, the acquisition of calibration data, the calculation of the extrinsic parameters between the camera and the LiDAR, and some examples of camera-LiDAR fusion applications. In this solution, the corners of a calibration board are used as calibration targets. Thanks to the non-repetitive scanning pattern of Livox LiDAR, the accurate corner positions can be found more easily in the high-density point cloud, which yields a better calibration result and better LiDAR-camera fusion.

Use the following link to download the example data used to calibrate the intrinsic and extrinsic parameters. The data is already saved in the default paths.

Data example

Step1: Environment configuration

(The following calibration process assumes a 64-bit Ubuntu 16.04 environment.)

1.1 Install environment and driver

Install the ROS environment, the Livox SDK, and livox_ros_driver. You can skip this step if they are already installed.

# install Livox_SDK
git clone https://github.com/Livox-SDK/Livox-SDK.git
cd Livox-SDK
sudo ./third_party/apr/apr_build.sh
cd build && cmake ..
make
sudo make install

# install livox_ros_driver
cd ..
git clone https://github.com/Livox-SDK/livox_ros_driver.git ws_livox/src
cd ws_livox
catkin_make

1.2 Install dependencies

Install PCL, Eigen, and the Ceres solver. You can skip this step if they are already installed.

1.3 Download source code and compile

# install camera/lidar calibration package
cd src
git clone https://github.com/Livox-SDK/livox_camera_lidar_calibration.git
cd ..
catkin_make
source devel/setup.bash

1.4 Program node briefs

This project includes the following nodes:

  • cameraCalib - calibrate the camera intrinsic parameters

  • pcdTransfer - convert the LiDAR rosbags to PCD files

  • cornerPhoto - obtain the corner coordinates in the photo

  • getExt1 - calculate the extrinsic parameters (optimize the extrinsic only)

  • getExt2 - calculate the extrinsic parameters (optimize the intrinsic and extrinsic together)

  • projectCloud - project the LiDAR point cloud onto the photo

  • colorLidar - color the LiDAR point cloud

You can find the corresponding launch files in the src/calibration/launch folder if you want to modify them.

Step2: Calibration of camera intrinsic parameters

2.1 Preparations

Prepare a calibration chessboard (as shown below, which can be printed out).

Install MATLAB to calculate the intrinsic parameters, or use the cameraCalib node instead.

Parameters adjustment

Make sure that the selected camera follows a pinhole imaging model. First, adjust the basic parameters of the camera itself (focus, photo dimensions, etc.). In this case we use a Livox Horizon LiDAR and a Hikvision industrial camera MV-CE060-10UC, as shown in the figure below. To keep the field of view of the photo consistent with that of the LiDAR point cloud, the photo dimensions need to be modified according to the FOV (field of view) of the LiDAR; the Horizon's FOV is 81.7° (horizontal) x 25.1° (vertical), so the photos used in this case have a dimension of 1520 (width) x 568 (height) pixels.
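The relationship between FOV and cropped photo dimension follows from the pinhole model: a field of view θ spans 2·f·tan(θ/2) pixels, where f is the focal length expressed in pixels. A minimal sketch (the focal lengths below are hypothetical placeholders, not the values used by the authors):

```python
import math

def fov_to_pixels(focal_px: float, fov_deg: float) -> int:
    # Pinhole model: a field of view `fov_deg` spans 2 * f * tan(fov/2)
    # pixels when f is the focal length expressed in pixels.
    return round(2.0 * focal_px * math.tan(math.radians(fov_deg) / 2.0))

# Hypothetical focal lengths (take fx, fy from your own intrinsic matrix).
fx, fy = 900.0, 900.0
crop_width = fov_to_pixels(fx, 81.7)   # Horizon horizontal FOV
crop_height = fov_to_pixels(fy, 25.1)  # Horizon vertical FOV
print(crop_width, crop_height)
```

Crop the photo roughly symmetrically around the principal point to these dimensions; the exact crop also depends on lens distortion and how the camera is mounted relative to the LiDAR.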

2.2 MATLAB calibration

This method requires MATLAB to calculate the results; if you don't want to use MATLAB, please refer to 2.3.

Normally you need to prepare more than 20 photos, which together should cover the whole FOV. To get a better result, keep a distance of about 3 meters when shooting, as shown in the following figure.

We use MATLAB's cameraCalibrator tool; it outputs the results below, of which we need the first, second, and eleventh entries.

2.3 cameraCalib calibration

(You can skip this step if you have already used 2.2 MATLAB calibration)

Configure the corresponding paths and parameters in cameraCalib.launch after collecting the photo data. By default, put the photos under data/camera/photos and write the corresponding photo names in data/camera/in.txt, as shown in the following figure.
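For illustration only (the authoritative layout is the one shown in the figure), data/camera/in.txt is assumed to list one photo file name per line, matching the files placed under data/camera/photos:

```
1.bmp
2.bmp
3.bmp
```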

Then run the command to start calibrating:

roslaunch camera_lidar_calibration cameraCalib.launch

The calibration results will be saved in data/camera/result.txt, including reprojection errors, intrinsic parameters and distortion correction parameters.

2.4 Calibration result

  1. A 3x3 intrinsic matrix (IntrinsicMatrix) [Note 1]
  2. Five distortion correction parameters: k1, k2, p1, p2, k3
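This ordering matches OpenCV's distortion vector (k1, k2, p1, p2, k3). As a sketch of the underlying Brown-Conrady model, here is how the five parameters act on a normalized image point (x, y), i.e. coordinates already divided by z and centered on the optical axis:

```python
def distort(x: float, y: float, k1: float, k2: float,
            p1: float, p2: float, k3: float = 0.0):
    """Apply the Brown-Conrady distortion model to a normalized point."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 ** 2 + k3 * r2 ** 3
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d

print(distort(0.1, -0.2, 0, 0, 0, 0))  # zero parameters leave the point unchanged
```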

Step3: Preparations and data collections

3.1 Calibration scene preparation

In our project, we use the four corner points of the calibration board as the target points [Note 2]. It is better to select a relatively empty environment for the calibration scene, to make the calibration board easy to identify and to keep a certain distance between the LiDAR and the board. In this case, a 1 x 1.5 m calibration board made of low-reflectivity foam is used. Place the calibration board at no fewer than 10 different angles and distances (similar to the placement used for camera intrinsic calibration), preferably about 3 meters away for better accuracy, and collect data from different positions and angles [Note 3].

Place calibration board at different positions and angles

3.2 Connect the LiDAR

Check that the board corners are visible in the point cloud; use this command to view the point cloud:

roslaunch livox_ros_driver livox_lidar_rviz.launch

Use this command instead when recording the rosbag:

roslaunch livox_ros_driver livox_lidar_msg.launch

Please refer to https://github.com/Livox-SDK/livox_ros_driver to confirm that the rosbag data format is CustomMsg, which is required by the subsequent programs.

3.3 Connect the camera

In this solution we use the Hikvision MVS client software to connect to and turn on the camera, then check the photo quality and the board corners in the photo.

3.4 Collect the photo and LiDAR data

  1. Take the photo.

  2. Run the command to record the point cloud:

rosbag record /livox/lidar

  3. Save one photo and one rosbag of about 10 seconds for each scene.

  4. After collecting all the data, put the photos in the data/photo folder and the LiDAR rosbags in the data/lidar folder.

Step4: Calibration data acquisition

4.1 Parameter setting

First, save the intrinsic parameters and distortion correction parameters obtained in Step 2 to the file data/parameters/intrinsic.txt [Note 4]. Distortion corresponds to the five distortion correction parameters, in order: k1, k2 (RadialDistortion), p1, p2 (TangentialDistortion), and k3, which is normally set to 0 by default.

4.2 Acquire the corner coordinates in photo

  1. Configure the launch file cornerPhoto.launch, and run
roslaunch camera_lidar_calibration cornerPhoto.launch
  2. The program will open the corresponding photo [Note 5]. In this UI window you can move the mouse over each corner of the calibration board; the corresponding coordinates are displayed at the bottom left of the window. Follow a consistent order: generally start with the upper-left corner and go counterclockwise, noting the coordinates of all four corner points.
  3. Select the displayed picture and press any key to enter the coordinate input process. Enter the four corners as "x y" pairs in order, with a space between x and y (for example: "635 487"). Then enter "0 0" to end the input process (as shown in the figure below). The program refines the four points to more precise floating-point values and saves them to the default path data/corner_photo.txt. Press any key to end the whole process.
  4. Change the path in cornerPhoto.launch and repeat the process until the corner coordinates of all photos are acquired.

4.3 Acquire the corner coordinates in point cloud

  1. Check the rosbag path in pcdTransfer.launch, set the number of rosbags, and name the rosbags 0.bag, 1.bag, ...etc.

  2. Run the command to convert the rosbags to PCD files in batches; they will be saved in the default path data/pcdFiles.

roslaunch camera_lidar_calibration pcdTransfer.launch
  3. Use the PCL visualization program to open the PCD files; hold shift + left click to get the coordinates of the selected point [Note 6]. Please keep the same corner order as in the photos.
pcl_viewer -use_point_picking xx.pcd
  4. Save the xyz coordinates of all the PCD files to data/corner_lidar.txt in the following format.

You can also use point cloud visualization programs other than pcl_viewer to obtain the corner coordinates.

Step5: Extrinsic calculation

5.1 Parameter setting

The extrinsic calculation node reads the calibration data in data/corner_photo.txt and data/corner_lidar.txt to calculate the extrinsic parameters; the data needs to be saved in a specific format to be read correctly. Referring to the figure below, only lines with more than 10 characters are read as calibration data; titles such as "1", "2", "3", "4", "lidar0", or "0.bmp" are skipped. The program stops reading data and starts calculating when it reads a blank line.

Before the calculation, check the calibration data to make sure that each line corresponds to the same corner and the data amount is the same [Note 7].
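The reading rules described above can be sketched in Python (an illustrative re-implementation for checking your files, not the node's actual code):

```python
def read_corner_file(path: str):
    """Collect data rows following the rules described above:
    rows longer than 10 characters are parsed as numbers, shorter
    rows (titles such as 'lidar0' or '0.bmp') are skipped, and
    reading stops at the first blank line."""
    rows = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line:            # blank line: stop reading
                break
            if len(line) <= 10:     # title line: skip
                continue
            rows.append([float(v) for v in line.split()])
    return rows
```

Running it on both corner files and comparing `len(...)` of the two results is a quick way to verify that the data amounts match before starting the optimization.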

5.2 Calculation node getExt1

Configure the initial extrinsic value in the getExt1.launch file first [Note 8], then run the command to calculate the extrinsic parameters.

roslaunch camera_lidar_calibration getExt1.launch

The cost of each iteration will be printed on the terminal, and the result will be saved to data/parameters/extrinsic.txt in the form of a homogeneous matrix.
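As an illustration of how such a homogeneous extrinsic matrix is used: a LiDAR point is transformed into the camera frame and then projected through the pinhole intrinsics. The matrix below is only the axis-permutation initial guess (LiDAR x forward mapped to camera z forward), not a calibrated result, and fx, fy, cx, cy are placeholder intrinsics; distortion is ignored.

```python
def project(point, extrinsic, fx, fy, cx, cy):
    """Project a LiDAR point [x, y, z] to pixel (u, v) using a 4x4
    homogeneous extrinsic matrix and pinhole intrinsics."""
    x, y, z = point
    xc = sum(extrinsic[0][i] * v for i, v in enumerate((x, y, z, 1.0)))
    yc = sum(extrinsic[1][i] * v for i, v in enumerate((x, y, z, 1.0)))
    zc = sum(extrinsic[2][i] * v for i, v in enumerate((x, y, z, 1.0)))
    return fx * xc / zc + cx, fy * yc / zc + cy

# Illustrative extrinsic: rotation-only axis permutation, zero translation.
T = [[0, -1, 0, 0],
     [0, 0, -1, 0],
     [1, 0, 0, 0],
     [0, 0, 0, 1]]
u, v = project([3.604, 0.348, 0.127], T, 900.0, 900.0, 760.0, 284.0)
```

The reprojection errors printed by the node (errorU, errorV) are simply the absolute differences between such projected pixel coordinates and the corner coordinates picked in step 4.2.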

We can evaluate the obtained extrinsic parameters by the optimization cost and the reprojection error [Note 9].

Data with large errors will be printed on the terminal during reprojection; after this outlier detection, we can remove the outliers and redo the calculation.

5.3 Calculation node getExt2

The node getExt1 optimizes only the extrinsic parameters, while the node getExt2 optimizes the intrinsic and extrinsic parameters together. Run the command to get new intrinsic and extrinsic matrices and redo the reprojection with the new parameters.

roslaunch camera_lidar_calibration getExt2.launch

Normally the node getExt1 gives a good result; but if the result always has a large cost even after verifying the initial value and removing outliers, try the node getExt2. Make sure the amount of data is large before using this node, and verify the obtained result carefully.

If the result obtained by getExt2 is better after the verification, update the new intrinsic parameters in data/parameters/intrinsic.txt.

Step6: Results verification and related applications

6.1 Briefs

After obtaining extrinsic parameters, we can use two common applications to see the fusion result. The first one is the projection of the point cloud on the photo, the second one is the point cloud coloring [Note 10].

6.2 Projection of the point cloud on the photo

Set the rosbag and photo paths in the projectCloud.launch file, then run the command to project a certain number of point cloud frames onto the photo:

roslaunch camera_lidar_calibration projectCloud.launch
Photo projection

6.3 Point cloud coloring

Set the rosbag and photo paths in the colorLidar.launch file, then run the command and check the coloring rendering in rviz:

roslaunch camera_lidar_calibration colorLidar.launch
Coloring rendering

Notes

  1. The format of the intrinsic matrix is shown in the following figure. Generally, there are values at four positions of the matrix: (0,0), (0,2), (1,1), and (1,2).

  2. The basic principle of the calibration is to compute the transformation between the xyz coordinates of a target in the LiDAR coordinate system and its xy pixel coordinates in the camera image. Because a board corner is a relatively distinct target in both the point cloud and the photo, using corners reduces the calibration error.

  3. We can also use multiple calibration boards, or a chessboard calibration board that can be recognized by the LiDAR.

  4. Be careful not to change the format; otherwise you will need to modify the relevant code in the getIntrinsic() and getDistortion() functions in the common.h file.

  5. The displayed picture has been corrected with the distortion correction parameters, so please check the correction quality. For example, the photo correction on the left in the figure below has a problem (the distortion parameters may be set incorrectly), while the photo on the right is normal.

  6. After opening pcl_viewer, you can press "h" in the terminal for usage guidance.

  7. Note that a line with fewer than 10 characters cannot be read as calibration data, so if the point cloud xyz or photo xy coordinates are short, pad the line to more than 10 characters.

  8. The default initial value in the program is set according to the coordinate system of the Livox LiDAR and the relative position of the LiDAR and the camera; it should be modified to match the real setup. A badly chosen initial value may lead to poor optimization results.

  9. If the cost is still relatively large at the end of the iteration (for example, on the order of 10^4), the initial value in the launch file may be the problem and the result only a local optimum; in that case reset the initial value.

  10. Through the intrinsic and extrinsic matrices, the LiDAR point cloud can be projected onto the corresponding positions in the photo. The point color goes from blue to red to show distance from near to far. For coloring, the corresponding camera pixel coordinates are obtained from the point cloud coordinates via the intrinsic and extrinsic matrices, and the RGB value of that pixel is assigned to the point, so that the LiDAR point cloud displays real colors.
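The near-blue to far-red mapping mentioned in the last note can be sketched as a simple linear ramp (an illustrative mapping; the node's actual color map and distance thresholds may differ):

```python
def depth_to_rgb(depth, near=0.0, far=50.0):
    """Map a distance to an (r, g, b) tuple: blue at `near`, red at `far`.
    The `near`/`far` thresholds here are assumptions, not the node's values."""
    t = min(max((depth - near) / (far - near), 0.0), 1.0)
    return int(255 * t), 0, int(255 * (1.0 - t))

print(depth_to_rgb(0.0))   # nearest point: pure blue
print(depth_to_rgb(50.0))  # farthest point: pure red
```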

livox_camera_lidar_calibration's People

Contributors

bigrobinson, livox-sdk


livox_camera_lidar_calibration's Issues

Coordinates not displayed using cornerPhoto

Using your sample data as a test, the cornerPhoto launch file opens the first image, but no coordinates are displayed:


Where can the setting be found for enabling coordinate display?

When using cornerPhoto.launch to obtain the photo corner coordinates, the photo cannot be found, even though the path is correct:

setting /run_id to b54edd5c-5fff-11ed-b29a-19fefe31ebb3
process[rosout-1]: started with pid [9001]
started core service [/rosout]
process[cornerPhoto-2]: started with pid [9004]
Get the parameters from the launch file
No Picture found by filename: /home/li/Livox-SDK/ws_livox/src/camera_lidar_calibration/../../data/photo/1.bmp
[cornerPhoto-2] process has finished cleanly
log file: /home/li/.ros/log/b54edd5c-5fff-11ed-b29a-19fefe31ebb3/cornerPhoto-2*.log

Camera type

Good day.
Why does the camera need to follow a "pinhole imaging model"? Isn't it possible to use a normal camera with a fixed focal length?

Thanks in advance!

Excuse me, what is the reason for this error and how can it be solved?

RLException: [launch] is neither a launch file in package [camera_lidar_calibration] nor is [camera_lidar_calibration] a launch file name The traceback for the exception was written to the log file mcs@mcs:~/catkin_ws/src$

[cameraCalib-2] process has died

hjw@hjw-desktop:~$ roslaunch camera_lidar_calibration cameraCalib.launch
... logging to /home/hjw/.ros/log/5d94e990-67a8-11eb-bf97-00e04c68010e/roslaunch-hjw-desktop-8219.log
Checking log directory for disk usage. This may take a while.
Press Ctrl-C to interrupt
Done checking log file disk usage. Usage is <1GB.

started roslaunch server http://localhost:36375/

SUMMARY

PARAMETERS

  • /camera_folder_path: /home/hjw/data/ca...
  • /camera_in_path: /home/hjw/data/ca...
  • /col_number: 6
  • /height: 27
  • /result_path: /home/hjw/data/ca...
  • /rosdistro: melodic
  • /rosversion: 1.14.10
  • /row_number: 9
  • /width: 27

NODES
/
cameraCalib (camera_lidar_calibration/cameraCalib)

auto-starting new master
process[master]: started with pid [8231]
ROS_MASTER_URI=http://localhost:11311

setting /run_id to 5d94e990-67a8-11eb-bf97-00e04c68010e
process[rosout-1]: started with pid [8243]
started core service [/rosout]
process[cameraCalib-2]: started with pid [8246]
Get the parameters from the launch file
/home/hjw/data/camera/photos/18.bmp
OpenCV Error: Assertion failed (scn == 3 || scn == 4) in cvtColor, file /build/opencv-XDqSFW/opencv-3.2.0+dfsg/modules/imgproc/src/color.cpp, line 9716
terminate called after throwing an instance of 'cv::Exception'
what(): /build/opencv-XDqSFW/opencv-3.2.0+dfsg/modules/imgproc/src/color.cpp:9716: error: (-215) scn == 3 || scn == 4 in function cvtColor

[cameraCalib-2] process has died [pid 8246, exit code -6, cmd /home/hjw/camera_lidar_calibration/devel/lib/camera_lidar_calibration/cameraCalib __name:=cameraCalib __log:=/home/hjw/.ros/log/5d94e990-67a8-11eb-bf97-00e04c68010e/cameraCalib-2.log].
log file: /home/hjw/.ros/log/5d94e990-67a8-11eb-bf97-00e04c68010e/cameraCalib-2*.log
^C[rosout-1] killing on exit
[master] killing on exit
shutting down processing monitor...
... shutting down processing monitor complete
done

Order of the five distortion correction parameters

In section 2.4 (intrinsic results) of the documentation:

The documentation lists the five distortion correction parameters as k1, k2, p1, p2, k3.

Shouldn't the order of the distortion correction parameters be k1, k2, k3, p1, p2?

How to calculate the image dimension from the FOV

"The photo dimension needs to be modified according to the FOV(field of view) of LiDAR, the Horizon ’s FOV is 81.7 (horizontal) x 25.1 (vertical), so the photos used in this case have a dimension of 1520 (width) x 568 (height) pixels."

I bought a Livox Mid-40 and would like to know how to calculate the image dimension from its 38.4° circular FOV. Thank you.

Using with loam

Hi, what steps do i take to use this data with loam mapping?

Hi, I've got an issue in step 4.2

Hi, I want to ask something in this repo.

In step 4.2 the cursor should show the x, y coordinates in the image, but in my case I cannot get the coordinates. Is there any solution?

About the camera configuration

Why are the images captured with this camera model so dark? Is a specific lens required, is an additional light source needed, or are there other parameters that must be configured?

Error when running roslaunch camera_lidar_calibration cameraCalib.launch

Hello, when I run roslaunch camera_lidar_calibration cameraCalib.launch I get the following error:
ERROR: cannot launch node of type [camera_lidar_calibration/cameraCalib]: Cannot locate node of type [cameraCalib] in package [camera_lidar_calibration]. Make sure file exists in package path and permission is set to executable (chmod +x)

I changed the file permissions and the file does exist, so why does this error occur?

Photo dimension modification/cropping

I've seen another open issue about this, but since there's no answer I'm asking again.
You say that "To ensure that the field of view of the photo and the field of LiDAR point cloud are consistent, the photo dimension needs to be modified according to the FOV(field of view) of LiDAR, the Horizon ’s FOV is 81.7 (horizontal) x 25.1 (vertical), so the photos used in this case have a dimension of 1520 (width) x 568 (height) pixels."
So how does it work when using, for example, a Livox Tele-15 and a camera with a maximum resolution of 3840x2160? Should we consider the camera FOV (in my case 50° x 37°)? Is there a basic formula to follow?
I really can't figure out how to do this scaling/cropping of the saved camera images.
Thank you

ROS noetic support under Ubuntu 20.04 LTS

When using your package under ROS noetic, we repeatedly receive the following error:

[ 97%] Linking CXX executable /ROS_ws/devel/lib/camera_lidar_calibration/projectCloud
[ 97%] Built target projectCloud
make[2]: *** [livox_camera_lidar_calibration/CMakeFiles/colorLidar.dir/build.make:63: livox_camera_lidar_calibration/CMakeFiles/colorLidar.dir/src/color_lidar_display.cpp.o] Error 1
make[1]: *** [CMakeFiles/Makefile2:3554: livox_camera_lidar_calibration/CMakeFiles/colorLidar.dir/all] Error 2
make: *** [Makefile:141: all] Error 2

We manually installed PCL, Eigen and Ceres solver without any issue. All tests were passed.

Building livox_camera_lidar_calibration fails; can you help to check it?

(base) euler@euler-pc:~/Desktop/ws_livox$ catkin_make
Base path: /home/euler/Desktop/ws_livox
Source space: /home/euler/Desktop/ws_livox/src
Build space: /home/euler/Desktop/ws_livox/build
Devel space: /home/euler/Desktop/ws_livox/devel
Install space: /home/euler/Desktop/ws_livox/install

Running command: "make cmake_check_build_system" in "/home/euler/Desktop/ws_livox/build"

Running command: "make -j16 -l16" in "/home/euler/Desktop/ws_livox/build"

[ 1%] Automatic MOC, UIC and RCC for target cornerPhoto
[ 1%] Built target std_msgs_generate_messages_eus
[ 1%] Built target std_msgs_generate_messages_nodejs
[ 3%] Automatic MOC, UIC and RCC for target pcdTransfer
[ 9%] Automatic MOC, UIC and RCC for target colorLidar
[ 9%] Automatic MOC, UIC and RCC for target getExt2
[ 9%] Automatic MOC, UIC and RCC for target getExt1
[ 12%] Automatic MOC, UIC and RCC for target projectCloud
[ 12%] Automatic MOC, UIC and RCC for target cameraCalib
[ 12%] Built target std_msgs_generate_messages_py
[ 12%] Built target std_msgs_generate_messages_lisp
[ 12%] Built target std_msgs_generate_messages_cpp
[ 12%] Built target cornerPhoto_autogen
[ 12%] Built target projectCloud_autogen
[ 12%] Built target getExt2_autogen
[ 12%] Built target getExt1_autogen
[ 12%] Built target cameraCalib_autogen
[ 12%] Built target pcdTransfer_autogen
[ 12%] Built target colorLidar_autogen
[ 12%] Built target _livox_ros_driver_generate_messages_check_deps_CustomMsg
[ 12%] Built target _livox_ros_driver_generate_messages_check_deps_CustomPoint
[ 18%] Built target livox_ros_driver_generate_messages_eus
[ 22%] Built target livox_ros_driver_generate_messages_lisp
[ 25%] Built target livox_ros_driver_generate_messages_cpp
[ 27%] Building CXX object livox_camera_lidar_calibration/CMakeFiles/cameraCalib.dir/src/cameraCalib.cpp.o
[ 31%] Built target livox_ros_driver_generate_messages_nodejs
[ 37%] Built target cornerPhoto
[ 42%] Built target livox_ros_driver_generate_messages_py
[ 44%] Building CXX object livox_camera_lidar_calibration/CMakeFiles/projectCloud.dir/src/projectCloud.cpp.o
[ 50%] Built target getExt1
[ 55%] Built target getExt2
[ 61%] Built target pcdTransfer
[ 66%] Built target colorLidar
[ 66%] Built target livox_ros_driver_generate_messages
[ 92%] Built target livox_ros_driver_node
/home/euler/Desktop/ws_livox/src/livox_camera_lidar_calibration/src/cameraCalib.cpp: In function ‘int main(int, char**)’:
/home/euler/Desktop/ws_livox/src/livox_camera_lidar_calibration/src/cameraCalib.cpp:91:36: error: ‘CV_RGB2GRAY’ was not declared in this scope
cvtColor(imageInput, view_gray, CV_RGB2GRAY); // 转灰度图
^~~~~~~~~~~
/home/euler/Desktop/ws_livox/src/livox_camera_lidar_calibration/src/cameraCalib.cpp:91:36: note: suggested alternative: ‘CV_RGB’
cvtColor(imageInput, view_gray, CV_RGB2GRAY); // 转灰度图
^~~~~~~~~~~
CV_RGB
/home/euler/Desktop/ws_livox/src/livox_camera_lidar_calibration/src/cameraCalib.cpp:98:85: error: ‘CV_TERMCRIT_EPS’ was not declared in this scope
cornerSubPix(view_gray, image_points_buf, Size(5, 5), Size(-1, -1), TermCriteria(CV_TERMCRIT_EPS + CV_TERMCRIT_ITER, 30, 0.1));
^~~~~~~~~~~~~~~
/home/euler/Desktop/ws_livox/src/livox_camera_lidar_calibration/src/cameraCalib.cpp:98:103: error: ‘CV_TERMCRIT_ITER’ was not declared in this scope
cornerSubPix(view_gray, image_points_buf, Size(5, 5), Size(-1, -1), TermCriteria(CV_TERMCRIT_EPS + CV_TERMCRIT_ITER, 30, 0.1));
^~~~~~~~~~~~~~~~
livox_camera_lidar_calibration/CMakeFiles/cameraCalib.dir/build.make:62: recipe for target 'livox_camera_lidar_calibration/CMakeFiles/cameraCalib.dir/src/cameraCalib.cpp.o' failed
make[2]: *** [livox_camera_lidar_calibration/CMakeFiles/cameraCalib.dir/src/cameraCalib.cpp.o] Error 1
CMakeFiles/Makefile2:601: recipe for target 'livox_camera_lidar_calibration/CMakeFiles/cameraCalib.dir/all' failed
make[1]: *** [livox_camera_lidar_calibration/CMakeFiles/cameraCalib.dir/all] Error 2
make[1]: *** Waiting for unfinished jobs....
/home/euler/Desktop/ws_livox/src/livox_camera_lidar_calibration/src/projectCloud.cpp: In function ‘void loadPointcloudFromROSBag(const string&)’:
/home/euler/Desktop/ws_livox/src/livox_camera_lidar_calibration/src/projectCloud.cpp:52:32: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
if (lidar_datas.size() > (threshold_lidar/24000 + 1)) {
~~~~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~
/home/euler/Desktop/ws_livox/src/livox_camera_lidar_calibration/src/projectCloud.cpp: In function ‘int main(int, char**)’:
/home/euler/Desktop/ws_livox/src/livox_camera_lidar_calibration/src/projectCloud.cpp:196:31: error: ‘CV_WINDOW_KEEPRATIO’ was not declared in this scope
cv::namedWindow("source", CV_WINDOW_KEEPRATIO);
^~~~~~~~~~~~~~~~~~~
/home/euler/Desktop/ws_livox/src/livox_camera_lidar_calibration/src/projectCloud.cpp:196:31: note: suggested alternative: ‘CV_MINOR_VERSION’
cv::namedWindow("source", CV_WINDOW_KEEPRATIO);
^~~~~~~~~~~~~~~~~~~
CV_MINOR_VERSION
livox_camera_lidar_calibration/CMakeFiles/projectCloud.dir/build.make:62: recipe for target 'livox_camera_lidar_calibration/CMakeFiles/projectCloud.dir/src/projectCloud.cpp.o' failed
make[2]: *** [livox_camera_lidar_calibration/CMakeFiles/projectCloud.dir/src/projectCloud.cpp.o] Error 1
CMakeFiles/Makefile2:1086: recipe for target 'livox_camera_lidar_calibration/CMakeFiles/projectCloud.dir/all' failed
make[1]: *** [livox_camera_lidar_calibration/CMakeFiles/projectCloud.dir/all] Error 2
Makefile:140: recipe for target 'all' failed
make: *** [all] Error 2
Invoking "make -j16 -l16" failed
(base) euler@euler-pc:~/Desktop/ws_livox$

getExt1 unable to evaluate residual block with Ceres Solver

We have executed your process faithfully, generating eight RGB images with intrinsic/distortion data and picking points on both the images and the corresponding PCD point clouds. Everything is in order, but the extrinsics cannot be extracted. We are using Ceres 2.0.0 and PCL 1.8.1; your instructions do not indicate which versions are required.

The following is the console log output:

Terminal Log - Extrinsics Capture

started roslaunch server http://ubuntu:40985/

SUMMARY

PARAMETERS

  • /error_threshold: 12
  • /extrinsic_path: /home/visionarymi...
  • /init_value: [0.0, -1.0, 0.0, ...
  • /input_lidar_path: /home/visionarymi...
  • /input_photo_path: /home/visionarymi...
  • /intrinsic_path: /home/visionarymi...
  • /rosdistro: melodic
  • /rosversion: 1.14.10

NODES
/
getExt1 (camera_lidar_calibration/getExt1)

auto-starting new master
process[master]: started with pid [21409]
ROS_MASTER_URI=http://localhost:11311

setting /run_id to 40fcc524-5b86-11eb-9025-000c29a7eb2a
process[rosout-1]: started with pid [21420]
started core service [/rosout]
process[getExt1-2]: started with pid [21423]
Get the parameters from the launch file
3.604 0.348 0.127 868 453
3.632 -0.047 0.12 968 450
3.579 0.355 -0.44 866.199 602.522
3.591 -0.04 -0.439 970.089 603.662
3.221 0.954 0.16 680.35 436.59
3.332 0.584 0.149 797.771 440.533
3.205 0.967 -0.413 678.422 607.985
3.309 0.589 -0.424 796.386 602.285
3.535 0.05 0.133 943 449
3.449 -0.324 0.126 1045.36 444.462
3.512 0.057 -0.434 941.998 603.281
3.472 -0.325 -0.441 1046.23 607.426
4.471 0.577 0.089 837.117 471.398
4.61 0.191 0.089 918 472
4.448 0.58 -0.464 841 593
4.483 0.202 -0.471 921 590
4.332 1.24 0.113 697 466
4.449 0.87 0.097 778.264 469.58
4.245 1.215 -0.441 693.186 595.975
0 0 0 778.178 594.362
4.332 1.24 0.113 837.424 429.42
4.449 0.87 0.097 960.897 429.78
4.245 1.215 -0.441 840.903 614.713
0 0 0 962 608
2.864 0.723 0.154 724.223 423.595
3.021 0.371 0.159 850.246 430.882
2.861 0.724 -0.418 725.17 616.304
3.011 0.373 -0.413 850.281 615.028
2.697 0.19 0.166 893.511 414.386
2.648 -0.194 0.164 1032.19 409.463
2.676 0.189 -0.405 894.237 616.112
2.614 -0.198 -0.408 1035.77 618.983
2.203 0.201 0.184 867.476 381.646
2.215 -0.175 0.184 1035.6 380.578
2.174 0.208 -0.38 870.21 632.406
2.202 -0.18 -0.384 1039.56 630.145
2.094 0.333 0.192 799 373
2.264 -0.023 0.186 969.316 384.139
2.075 0.335 -0.38 807.088 635.786
2.239 -0.021 -0.388 972.289 628.854
WARNING: Logging before InitGoogleLogging() is written to STDERR
W0120 17:16:14.738849 21423 residual_block.cc:129]

Error in evaluating the ResidualBlock.

There are two possible reasons. Either the CostFunction did not evaluate and fill all
residual and jacobians that were requested or there was a non-finite value (nan/infinite)
generated during the or jacobian computation.

Residual Block size: 2 parameter blocks x 2 residuals

For each parameter block, the value of the parameters are printed in the first column
and the value of the jacobian under the corresponding residual. If a ParameterBlock was
held constant then the corresponding jacobian is printed as 'Not Computed'. If an entry
of the Jacobian/residual array was requested but was not written to by user code, it is
indicated by 'Uninitialized'. This is an error. Residuals or Jacobian values evaluating
to Inf or NaN is also an error.

Residuals: -nan -nan

Parameter Block 0, size: 4

     0.5 |         -nan         -nan 
    -0.5 |         -nan         -nan 
     0.5 |         -nan         -nan 
     0.5 |         -nan         -nan 

Parameter Block 1, size: 3

       0 |         -nan         -nan 
       0 |         -nan         -nan 
       0 |         -nan         -nan 

E0120 17:16:14.738970 21423 trust_region_minimizer.cc:73] Terminating: Residual and Jacobian evaluation failed.
Ceres Solver Report: Iterations: 0, Initial cost: -1.000000e+00, Final cost: -1.000000e+00, Termination: FAILURE
0 -1 0
0 0 -1
1 0 0
0
0
0
Use the extrinsic result to reproject the data
Data 0 has a error bigger than the threshold
xyz are 3.604 0.348 0.127
ErrorU is 8.04004 errorV is 61.6647


Data 1 has a error bigger than the threshold
xyz are 3.632 -0.047 0.12
ErrorU is 8.23987 errorV is 66.6819


Data 2 has a error bigger than the threshold
xyz are 3.579 0.355 -0.44
ErrorU is 7.43412 errorV is 57.2514


Data 3 has a error bigger than the threshold
xyz are 3.591 -0.04 -0.439
ErrorU is 4.50225 errorV is 55.4791


Data 4 has a error bigger than the threshold
xyz are 3.221 0.954 0.16
ErrorU is 13.0226 errorV is 64.832


Data 5 has a error bigger than the threshold
xyz are 3.332 0.584 0.149
ErrorU is 6.2434 errorV is 65.4356


Data 6 has a error bigger than the threshold
xyz are 3.205 0.967 -0.413
ErrorU is 9.88592 errorV is 57.221


Data 7 has a error bigger than the threshold
xyz are 3.309 0.589 -0.424
ErrorU is 5.13097 errorV is 62.2552


Data 8 has a error bigger than the threshold
xyz are 3.535 0.05 0.133
ErrorU is 8.45544 errorV is 63.4766


Data 9 has a error bigger than the threshold
xyz are 3.449 -0.324 0.126
ErrorU is 4.99999 errorV is 69.0158


Data 10 has a error bigger than the threshold
xyz are 3.512 0.057 -0.434
ErrorU is 7.54875 errorV is 57.0768


Data 11 has a error bigger than the threshold
xyz are 3.472 -0.325 -0.441
ErrorU is 3.8242 errorV is 56.0874


Data 12 has a error bigger than the threshold
xyz are 4.471 0.577 0.089
ErrorU is 9.18836 errorV is 57.3323


Data 13 has a error bigger than the threshold
xyz are 4.61 0.191 0.089
ErrorU is 8.4856 errorV is 57.281


Data 14 has a error bigger than the threshold
xyz are 4.448 0.58 -0.464
ErrorU is 4.07758 errorV is 49.6893


Data 15 has a error bigger than the threshold
xyz are 4.483 0.202 -0.471
ErrorU is 2.16626 errorV is 53.3746


Data 16 has a error bigger than the threshold
xyz are 4.332 1.24 0.113
ErrorU is 5.46783 errorV is 57.0621


Data 17 has a error bigger than the threshold
xyz are 4.449 0.87 0.097
ErrorU is 7.1934 errorV is 57.4105


Data 18 has a error bigger than the threshold
xyz are 4.245 1.215 -0.441
ErrorU is 9.30271 errorV is 46.3202


Data 20 has a error bigger than the threshold
xyz are 4.332 1.24 0.113
ErrorU is 134.956 errorV is 93.6421


Data 21 has a error bigger than the threshold
xyz are 4.449 0.87 0.097
ErrorU is 175.44 errorV is 97.2105


Data 22 has a error bigger than the threshold
xyz are 4.245 1.215 -0.441
ErrorU is 138.414 errorV is 27.5822


Data 24 has a error bigger than the threshold
xyz are 2.864 0.723 0.154
ErrorU is 9.17202 errorV is 74.0685


Data 25 has a error bigger than the threshold
xyz are 3.021 0.371 0.159
ErrorU is 1.77573 errorV is 67.8268


Data 26 has a error bigger than the threshold
xyz are 2.861 0.724 -0.418
ErrorU is 7.66301 errorV is 64.7189


Data 27 has a error bigger than the threshold
xyz are 3.011 0.373 -0.413
ErrorU is 0.759649 errorV is 57.7946


Data 28 has a error bigger than the threshold
xyz are 2.697 0.19 0.166
ErrorU is 6.42211 errorV is 76.1413


Data 29 has a error bigger than the threshold
xyz are 2.648 -0.194 0.164
ErrorU is 0.751157 errorV is 80.7124


Data 30 has a error bigger than the threshold
xyz are 2.676 0.189 -0.405
ErrorU is 5.53217 errorV is 69.7203


Data 31 has a error bigger than the threshold
xyz are 2.614 -0.198 -0.408
ErrorU is 2.05906 errorV is 71.1952


Data 32 has a error bigger than the threshold
xyz are 2.203 0.201 0.184
ErrorU is 13.4323 errorV is 88.7242


Data 33 has a error bigger than the threshold
xyz are 2.215 -0.175 0.184
ErrorU is 1.09482 errorV is 90.2074


Data 34 has a error bigger than the threshold
xyz are 2.174 0.208 -0.38
ErrorU is 6.63821 errorV is 74.9367


Data 35 has a error bigger than the threshold
xyz are 2.202 -0.18 -0.384
ErrorU is 0.360415 errorV is 76.8253


Data 36 has a error bigger than the threshold
xyz are 2.094 0.333 0.192
ErrorU is 19.879 errorV is 89.877


Data 37 has a error bigger than the threshold
xyz are 2.264 -0.023 0.186
ErrorU is 4.37864 errorV is 87.4853


Data 38 has a error bigger than the threshold
xyz are 2.075 0.335 -0.38
ErrorU is 9.57661 errorV is 79.2072


Data 39 has a error bigger than the threshold
xyz are 2.239 -0.021 -0.388
ErrorU is 0.691957 errorV is 77.1114


u average error is: nan
v average error is: nan

The reprojection error bigger than the threshold, extrinsic result seems not ok
[getExt1-2] process has finished cleanly
log file: /home/visionarymind/.ros/log/40fcc524-5b86-11eb-9025-000c29a7eb2a/getExt1-2*.log

We see this same problem if we use your provided sample data, so the issue is likely not in the data itself.

The Chinese documentation is incorrect

https://github.com/Livox-SDK/livox_camera_lidar_calibration/blob/master/doc_resources/README_cn.md

1.3 Download source code and compile

install camera/lidar calibration package

cd src
git clone https://github.com/Livox-SDK/livox_camera_lidar_calibration.git
cd ..
catkin_make
source devel/setup.bash

1.3 Download the source code and prepare to compile
git clone https://github.com/Livox-SDK/livox_camera_lidar_calibration.git
cd camera_lidar_calibration
catkin_make
source devel/setup.bash

One step, the path change, is missing:
cd src

A real pitfall.

FoV of camera and LiDAR

To ensure that the field of view of the photo and that of the LiDAR point cloud are consistent, the photo dimensions need to be cropped according to the FOV (field of view) of the LiDAR. The Horizon's FOV is 81.7 (horizontal) x 25.1 (vertical), so the photos used in this case have a dimension of 1520 (width) x 568 (height) pixels. Can someone please help me understand how this works for a custom camera lens with a different FoV and resolution?
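The README does not spell out the exact formula, but under a pinhole model the crop size follows from the camera focal lengths and the LiDAR FOV. A minimal sketch, assuming hypothetical focal lengths fx and fy taken from your own cameraCalib result (the numeric values below are illustrative, not from the repo):

```python
import math

def crop_size_for_fov(fx, fy, hfov_deg, vfov_deg):
    """Pixel extent that a pinhole camera with focal lengths (fx, fy)
    spans over the given horizontal/vertical field of view."""
    w = 2.0 * fx * math.tan(math.radians(hfov_deg) / 2.0)
    h = 2.0 * fy * math.tan(math.radians(vfov_deg) / 2.0)
    return round(w), round(h)

# Hypothetical focal lengths: substitute the fx, fy from your own
# intrinsic calibration. For the Horizon's 81.7 x 25.1 degree FOV this
# lands in the neighborhood of the 1520 x 568 quoted above.
w, h = crop_size_for_fov(fx=880.0, fy=1270.0, hfov_deg=81.7, vfov_deg=25.1)
```

For a different lens, plug in that camera's calibrated fx/fy and your LiDAR's FOV instead; the crop's aspect ratio will generally not match the FOV ratio because fx and fy differ.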

Other LiDARs

I would like to ask whether this method also works with other LiDAR models.

Can we do it as a real-time stream, not just with a picture and saved points?

Hi Livox,
Greetings. I used camera_lidar_calibration with the provided sample data and it worked fine. But can we do it in real time? I mean, everything here works on photos and a bag file; even for the output, projectCloud.launch / colorLidar.launch use a photo as well as a bag file.
So, is there any option to run it on a video stream?
And another question:
in getExt1.launch I need to configure the initial rotation matrix and translation vector.
Are these the extrinsic parameters of the camera or of the LiDAR?
And how can I get these initial values?
Thanks.
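On the initial values: getExt1 only needs a rough guess that gets the axes right. A common starting point (and the rotation that appears verbatim in the failed log above) is the pure axis swap between a LiDAR frame with x forward, y left, z up and a camera frame with x right, y down, z forward, with zero translation when the sensors are mounted close together. This is a sketch of that conventional guess, not the repo's official default; verify the axes against your own mounting:

```python
import numpy as np

# Axis remap from a LiDAR frame (x forward, y left, z up) to a camera
# frame (x right, y down, z forward). Row i gives camera coordinate i
# as a combination of the LiDAR coordinates.
R0 = np.array([
    [0.0, -1.0,  0.0],   # camera x = -lidar y  (right)
    [0.0,  0.0, -1.0],   # camera y = -lidar z  (down)
    [1.0,  0.0,  0.0],   # camera z =  lidar x  (forward)
])
t0 = np.zeros(3)  # translation guess: sensors roughly co-located

# Sanity-check that R0 is a proper rotation before feeding it to the solver.
assert np.allclose(R0 @ R0.T, np.eye(3)) and np.isclose(np.linalg.det(R0), 1.0)
```

If the optimizer terminates immediately or converges to a huge reprojection error, the usual suspect is an initial guess whose axes do not match the actual mounting.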

Compile Error

In file included from /usr/include/c++/5/backward/hash_map:60:0,
                 from /home/kevintsoi/github/cal_livox/src/livox_camera_lidar_calibration/src/color_lidar_display.cpp:8:
/usr/include/c++/5/backward/backward_warning.h:32:2: warning: #warning This file includes at least one deprecated or antiquated header which may be removed without further notice at a future date. Please use a non-deprecated interface with equivalent functionality instead. For a listing of replacement headers and interfaces, consult the file backward_warning.h. To disable this warning use -Wno-deprecated. [-Wcpp]
 #warning \
  ^
/home/kevintsoi/github/cal_livox/src/livox_camera_lidar_calibration/src/projectCloud.cpp: In function ‘void loadPointcloudFromROSBag(const string&)’:
/home/kevintsoi/github/cal_livox/src/livox_camera_lidar_calibration/src/projectCloud.cpp:52:32: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
         if (lidar_datas.size() > (threshold_lidar/24000 + 1)) {
                                ^
In file included from /usr/local/include/ceres/internal/autodiff.h:153:0,
                 from /usr/local/include/ceres/autodiff_cost_function.h:130,
                 from /usr/local/include/ceres/ceres.h:37,
                 from /home/kevintsoi/github/cal_livox/src/livox_camera_lidar_calibration/src/cam_lid_external1.cpp:3:
/usr/local/include/ceres/jet.h:998:8: error: ‘ScalarBinaryOpTraits’ is not a class template
 struct ScalarBinaryOpTraits<ceres::Jet<T, N>, T, BinaryOp> {
        ^
/usr/local/include/ceres/jet.h:1002:58: error: type/value mismatch at argument 3 in template parameter list for ‘template<class BinaryOp, class T, int N> struct Eigen::ScalarBinaryOpTraits’
 struct ScalarBinaryOpTraits<T, ceres::Jet<T, N>, BinaryOp> {
                                                          ^
/usr/local/include/ceres/jet.h:1002:58: note:   expected a constant of type ‘int’, got ‘BinaryOp’
In file included from /home/kevintsoi/github/cal_livox/src/livox_camera_lidar_calibration/src/color_lidar_display.cpp:18:0:
/usr/include/pcl-1.7/pcl/visualization/cloud_viewer.h:202:14: warning: ‘template<class> class std::auto_ptr’ is deprecated [-Wdeprecated-declarations]
         std::auto_ptr<CloudViewer_impl> impl_;
              ^
In file included from /usr/include/c++/5/bits/locale_conv.h:41:0,
                 from /usr/include/c++/5/locale:43,
                 from /usr/include/c++/5/iomanip:43,
                 from /usr/include/boost/math/policies/error_handling.hpp:12,
                 from /usr/include/boost/math/special_functions/round.hpp:14,
                 from /opt/ros/kinetic/include/ros/time.h:58,
                 from /opt/ros/kinetic/include/ros/ros.h:38,
                 from /home/kevintsoi/github/cal_livox/src/livox_camera_lidar_calibration/src/color_lidar_display.cpp:1:
/usr/include/c++/5/bits/unique_ptr.h:49:28: note: declared here
   template<typename> class auto_ptr;
                            ^
/home/kevintsoi/github/cal_livox/src/livox_camera_lidar_calibration/src/color_lidar_display.cpp: In function ‘void loadPointcloudFromROSBag(const string&)’:
/home/kevintsoi/github/cal_livox/src/livox_camera_lidar_calibration/src/color_lidar_display.cpp:53:32: warning: comparison between signed and unsigned integer expressions [-Wsign-compare]
         if (lidar_datas.size() > threshold_lidar) {

Calculation and U V Error issue.

Hi, I'm trying to get the extrinsic parameters with a Livox Mid-70 and an Intel RealSense D435i.

I completed all the earlier steps, but now I'm stuck at step 5.2.

My result of
roslaunch camera_lidar_calibration getExt1.launch
is:

xyz are 2.514 0.04 0.801
ErrorU is 2.19356 errorV is 16.2438

I think the x, y, z values are right, but the U and V errors are completely wrong. I think there is a calculation error. How can I solve it?
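For reference, the errorU/errorV printed by getExt1 are simply the pixel distances between each corner picked in the photo and the same LiDAR corner reprojected through the candidate extrinsic and the camera intrinsic. A minimal numpy sketch of that check (all numeric values below are hypothetical, not taken from the repo):

```python
import numpy as np

def reproject(xyz_lidar, R, t, K):
    """Project one LiDAR point into the image with extrinsic (R, t)
    and pinhole intrinsic K; returns the pixel (u, v)."""
    p_cam = R @ xyz_lidar + t
    uvw = K @ p_cam
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

# Hypothetical intrinsic and extrinsic, for illustration only.
K = np.array([[800.0,   0.0, 760.0],
              [  0.0, 800.0, 284.0],
              [  0.0,   0.0,   1.0]])
R = np.array([[0.0, -1.0,  0.0],
              [0.0,  0.0, -1.0],
              [1.0,  0.0,  0.0]])
t = np.zeros(3)

u, v = reproject(np.array([2.514, 0.04, 0.801]), R, t, K)
picked_u, picked_v = 740.0, 40.0            # corner picked in the photo
errorU, errorV = abs(u - picked_u), abs(v - picked_v)
```

ErrorV values that are all large and of one sign often indicate a vertical offset in the initial extrinsic or a wrong cy in the intrinsic matrix, rather than noisy corner picks.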

Hi, I've got an error when trying projectCloud.launch and colorLidar.launch

The result of projectCloud was shown as:
(screenshot 2021-12-30 15-40-59)
The result of colorLidar was shown as:
(screenshot 2021-12-30 15-42-40)
The extrinsic of getExt1 was:
(screenshot 2021-12-28 20-06-22)
The extrinsic of getExt2 was:
(screenshot 2021-12-29 20-51-22)
I used a Livox Mid-70, whose FOV is a 70.4° circular field, and the camera was an Azure KinectV3. A 4:3 ratio close to the circular field of view was used when shooting, and the picture size was 1024x768 pixels. The camera had already been corrected for distortion, so the distortion parameters were set to 0. The camera matrix obtained using OpenCV is as follows:
(image)
The views of the camera and LiDAR contained the same object, but there were basically no projected points, only a small patch of dense blue in the upper left corner. Besides, the point cloud was not colored. What is the problem?

Has this been solved?

    I have this problem too: on Ubuntu 18.04, when acquiring the corner data, the coordinates and the bottom-bar information are not displayed, so I cannot move on to the next step.

Originally posted by @blackusagi in #32 (comment)

Not able to acquire the corner coordinates in the point cloud using pcl_viewer

Hi,
I am trying to extract chessboard corner points from the pcd files using pcl_viewer, but I'm not able to identify the chessboard in the viewer to read off the coordinate values.
Can anyone give me tips on extracting corner points from the point cloud using pcl_viewer, or any other method if possible?

I've been badly stuck at this point for many days and cannot continue with the calibration.

Any help will be greatly appreciated.

Thanks,
Vivek SM

Can I integrate a Livox Mid-40 with a GoPro Hero 8 Black?

Hello guys,

I am new to Livox LiDARs and I would like to ask whether I can calibrate a Livox Mid-40 with a GoPro. If yes, what should I use for the photo dimensions, given that the Mid-40 has a 38.4-degree circular FOV?

Also, can you explain the logic behind the example where you say that "the Horizon's FOV is 81.7 (horizontal) x 25.1 (vertical), so the photos used in this case have a dimension of 1520 (width) x 568 (height) pixels"?

How can we calculate that?

Thanks in advance :D

extrinsic result seems not ok -- after running roslaunch camera_lidar_calibration getExt1.launch

Hi,
I'm using camera_lidar_calibration getExt1.launch to get the LiDAR-to-camera extrinsics. I'm using 10 pairs of pics and pcds, 40 points in total.
The logs are pasted below. Although the solver reports CONVERGENCE, the reprojection error is much bigger than the threshold.
Please help me, what is wrong?

---------------Logs:
iter cost cost_change |gradient| |step| tr_ratio tr_radius ls_iter iter_time total_time
0 3.263738e+05 0.00e+00 6.46e+05 0.00e+00 0.00e+00 1.00e+04 0 7.46e-03 5.42e-02
1 1.841992e+05 1.42e+05 1.67e+05 1.47e+00 9.54e-01 3.00e+04 1 1.46e-02 6.89e-02
2 1.768749e+05 7.32e+03 5.10e+03 2.50e-01 1.00e+00 9.00e+04 1 4.70e-05 6.90e-02
3 1.768657e+05 9.27e+00 9.36e+00 1.09e-02 1.00e+00 2.70e+05 1 7.41e-05 6.91e-02
Ceres Solver Report: Iterations: 4, Initial cost: 3.263738e+05, Final cost: 1.768657e+05, Termination: CONVERGENCE
-0.721294 -0.692535 0.0122162
-0.00404571 -0.0134298 -0.999907
0.692631 -0.721267 0.0068757
0.264165
-0.00771116
-1.74526
Use the extrinsic result to reproject the data
Data 0 has a error bigger than the threshold
xyz are 5.83209 -6.24977 0.447996
ErrorU is 13.851 errorV is 9.07203

.......


Data 36 has a error bigger than the threshold
xyz are 4.55385 -4.30335 0.334322
ErrorU is 27.3254 errorV is 14.5054


Data 38 has a error bigger than the threshold
xyz are 4.07123 -4.82789 -0.658529
ErrorU is 19.669 errorV is 11.8685


Data 39 has a error bigger than the threshold
xyz are 4.55385 -4.30335 -0.658529
ErrorU is 20.7527 errorV is 20.7066


u average error is: 27.4879
v average error is: 34.0989

The reprojection error bigger than the threshold, extrinsic result seems not ok

roslaunch failed

I have done the configuration as the readme said, but when I executed roslaunch, an error occurred: RLException: [cameraCalib.launch] is neither a launch file in package [camera_lidar_calibration] nor is [camera_lidar_calibration] a launch file name. How do I resolve this problem?

mid-70

How do I do this with a Mid-70? Is it the same procedure?

Problem converting a bag to pcd

Hello,
I ran into a problem when using pcdTransfer.launch. When I run roslaunch livox_ros_driver livox_lidar_rviz.launch to view the point cloud, it is displayed as in the first screenshot.
(screenshot 2022-03-27 19-52-49)

Then I run roslaunch livox_ros_driver livox_lidar_msg.launch and, in a new terminal, rosbag record /livox/lidar to record. The resulting bag file was converted to pcd files with pcdTransfer.launch, but when opened it looks like the second screenshot, which is inconsistent with the first: the objects along the laser path have disappeared after the conversion to pcd. What is going on?

(screenshot 2022-03-27 19-50-23)

I want to visualize these points with RGB data

Hi all,
I have already finished my LiDAR and camera calibration, but I have a question about the visualization.
As we can see for the points in RViz, the points should be colored with RGB values, but in my result I only get blue ones.
How can I get RGB-colored points?

(image: rgbd_test_result)
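For context, a colorLidar-style node colors a point only if it projects inside the image; with a wrong extrinsic (or a FOV mismatch) most points miss the image and keep their default color, which would explain an all-blue cloud. A minimal sketch of the idea, not the repo's actual implementation:

```python
import numpy as np

def color_points(points, image, R, t, K):
    """Assign each 3D point the RGB of the pixel it projects to.
    Points that fall outside the image keep a default (blue) color."""
    h, w, _ = image.shape
    colors = np.tile(np.array([0, 0, 255], np.uint8), (len(points), 1))
    for i, p in enumerate(points):
        pc = R @ p + t
        if pc[2] <= 0:                 # behind the camera: leave default
            continue
        u = int(round((K[0, 0] * pc[0] + K[0, 2] * pc[2]) / pc[2]))
        v = int(round((K[1, 1] * pc[1] + K[1, 2] * pc[2]) / pc[2]))
        if 0 <= u < w and 0 <= v < h:
            colors[i] = image[v, u]    # note: image is indexed row (v) first
    return colors
```

If nearly every point keeps the default color, re-check the extrinsic and the image crop before suspecting the coloring code.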
