
Poisson Surface Reconstruction for LiDAR Odometry and Mapping

[Figure: SuMa (surfels) vs. TSDF vs. puma (our approach)]

Qualitative comparison between the different mapping techniques for sequence 00 of the KITTI odometry benchmark.

This repository implements the algorithms described in our paper Poisson Surface Reconstruction for LiDAR Odometry and Mapping.

This is a LiDAR Odometry and Mapping pipeline that uses the Poisson Surface Reconstruction algorithm to build the map as a triangular mesh.

We propose a novel frame-to-mesh registration algorithm where we compute the poses of the vehicle by estimating the 6 degrees of freedom of the LiDAR. To achieve this, we project each scan onto the triangular mesh by computing the ray-to-triangle intersections between each point in the input scan and the map mesh. We accelerate this ray-casting technique using a Python wrapper of the Intel® Embree library.
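As a rough illustration of the per-ray test that Embree accelerates, here is a NumPy sketch of the classic Möller–Trumbore ray/triangle intersection. This is illustrative only, not the code used in puma (which delegates the whole operation to Embree):

```python
import numpy as np

def ray_triangle_intersect(origin, direction, v0, v1, v2, eps=1e-9):
    """Möller–Trumbore ray/triangle intersection.

    Returns the distance t along the ray, or None if there is no hit.
    """
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = np.dot(e1, p)
    if abs(det) < eps:  # ray parallel to the triangle plane
        return None
    inv_det = 1.0 / det
    s = origin - v0
    u = np.dot(s, p) * inv_det          # first barycentric coordinate
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = np.dot(direction, q) * inv_det  # second barycentric coordinate
    if v < 0.0 or u + v > 1.0:
        return None
    t = np.dot(e2, q) * inv_det         # distance along the ray
    return t if t > eps else None

# A ray from the origin along +z hits a triangle lying in the z = 1 plane
t = ray_triangle_intersect(
    np.zeros(3), np.array([0.0, 0.0, 1.0]),
    np.array([-1.0, -1.0, 1.0]),
    np.array([2.0, -1.0, 1.0]),
    np.array([-1.0, 2.0, 1.0]))
print(t)  # -> 1.0
```

Embree performs essentially this test, but batched over millions of rays against a BVH-indexed mesh, which is why it is orders of magnitude faster than a pure Python loop.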

The main application of our research is intended for autonomous driving vehicles.

Table of Contents

Running the code

NOTE: All the commands assume you are working in this shared workspace; therefore, first cd apps/ before running anything.

Requirements: Install docker

If you plan to use our docker container you only need to install docker and docker-compose.

If you don't want to use docker and prefer to install puma locally, you might want to visit the Installation Instructions.

Datasets

First, you need to indicate where all your datasets are located. To do so, just run:

export DATASETS=<full-path-to-datasets-location>

This env variable is shared between the docker container and your host system (in a read-only fashion).
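The repository's docker-compose.yml takes care of this mount; conceptually (this is an illustrative fragment under assumed names, not the file shipped in the repo), the read-only share looks something like:

```yaml
# Illustrative fragment only -- the actual docker-compose.yml may differ.
# The host folder pointed to by $DATASETS is mounted read-only at /data.
services:
  apps:
    volumes:
      - ${DATASETS}:/data:ro
```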

So far we've only tested our approach on the KITTI Odometry benchmark dataset and the Mai city dataset. Both datasets were recorded with a 64-beam Velodyne-like LiDAR.

Building the apps docker container

This container is in charge of running the apps and needs to be built with your user and group id (so you can share files). Building this container is straightforward thanks to the provided Makefile:

make

If you want to inspect the image you can get an interactive shell by running make run, but it's not mandatory.

Converting from .bin to .ply

All our apps use the PLY format, which is also binary but has much better support than raw binary files. Therefore, you will need to convert all your data before running any of the apps available in this repo.

docker-compose run --rm apps bash -c '\
    ./data_conversion/bin2ply.py \
    --dataset $DATASETS/kitti-odometry/dataset/ \
    --out_dir ./data/kitti-odometry/ply/ \
    --sequence 07
    '

Please change the --dataset option to point to where you have the KITTI dataset.
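For reference, the KITTI .bin files are flat arrays of float32 values, four per point (x, y, z, intensity). A minimal NumPy-only sketch of such a conversion (illustrative only, not the repo's bin2ply.py, which uses pykitti and encodes the intensity in the color channel) could look like:

```python
import numpy as np

def bin_to_ply(bin_path, ply_path):
    """Convert one KITTI .bin scan (N x 4 float32: x, y, z, intensity)
    to an ASCII PLY file, storing intensity as an extra float property."""
    scan = np.fromfile(bin_path, dtype=np.float32).reshape(-1, 4)
    with open(ply_path, "w") as f:
        # Minimal PLY header declaring one vertex element with 4 properties
        f.write("ply\nformat ascii 1.0\n")
        f.write(f"element vertex {len(scan)}\n")
        f.write("property float x\nproperty float y\nproperty float z\n")
        f.write("property float intensity\n")
        f.write("end_header\n")
        np.savetxt(f, scan, fmt="%.6f")
```

The real app additionally writes binary PLY and iterates over a whole sequence, but the point layout is the same.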

Running the puma pipeline

Go grab a coffee/mate, this will take some time...

docker-compose run --rm apps bash -c '\
    ./pipelines/slam/puma_pipeline.py  \
    --dataset ./data/kitti-odometry/ply \
    --sequence 07 \
    --n_scans 40
    '

Inspecting the results

The pipelines/slam/puma_pipeline.py script will generate 3 files on your host system:

results
├── kitti-odometry_07_depth_10_cropped_p2l_raycasting.ply # <- Generated Model
├── kitti-odometry_07_depth_10_cropped_p2l_raycasting.txt # <- Estimated poses
└── kitti-odometry_07_depth_10_cropped_p2l_raycasting.yml # <- Configuration

You can open the .ply with Open3D, Meshlab, CloudCompare, or the tool you like the most.

Where to go next

If you already installed puma then it's time to look for the standalone apps. These apps are executable command line interfaces (CLI) to interact with the core puma code:

├── data_conversion
│   ├── bin2bag.py
│   ├── kitti2ply.py
│   ├── ply2bin.py
│   └── ros2ply.py
├── pipelines
│   ├── mapping
│   │   ├── build_gt_cloud.py
│   │   ├── build_gt_mesh_incremental.py
│   │   └── build_gt_mesh.py
│   ├── odometry
│   │   ├── icp_frame_2_frame.py
│   │   ├── icp_frame_2_map.py
│   │   └── icp_frame_2_mesh.py
│   └── slam
│       └── puma_pipeline.py
└── run_poisson.py

All the apps should have a usable command-line interface, so if you need help you only need to pass the --help flag to the app you wish to use. For example, let's see the help message of the data conversion app bin2ply.py used above:

Usage: bin2ply.py [OPTIONS]

  Utility script to convert from the binary form found in the KITTI odometry
  dataset to .ply files. The intensity value for each measurement is encoded
  in the color channel of the output PointCloud.

  If a given sequence it's specified then it assumes you have a clean copy
  of the KITTI odometry benchmark, because it uses pykitti. If you only have
  a folder with just .bin files the script will most likely fail.

  If no sequence is specified then it blindly reads all the *.bin file in
  the specified dataset directory

Options:
  -d, --dataset PATH   Location of the KITTI dataset  [default:
                       /home/ivizzo/data/kitti-odometry/dataset/]

  -o, --out_dir PATH   Where to store the results  [default:
                       /home/ivizzo/data/kitti-odometry/ply/]

  -s, --sequence TEXT  Sequence number
  --use_intensity      Encode the intensity value in the color channel
  --help               Show this message and exit.

Citation

If you use this library for any academic work, please cite the original paper.

@inproceedings{vizzo2021icra,
author    = {I. Vizzo and X. Chen and N. Chebrolu and J. Behley and C. Stachniss},
title     = {{Poisson Surface Reconstruction for LiDAR Odometry and Mapping}},
booktitle = {Proc.~of the IEEE Intl.~Conf.~on Robotics \& Automation (ICRA)},
codeurl   = {https://github.com/PRBonn/puma/},
year      = 2021,
}


puma's Issues

Questions for the pointcloud Poisson reconstruction.

Thanks for your great work and open code!

In your code, you accumulate 30 scans of point clouds and then perform Poisson reconstruction. I want to know if I can do it with a single scan. In other words, can I get a good Poisson reconstruction surface from a single scan? If you have any advice on this question, I will be very grateful!

Thanks in advance!

Accurate mapping while localizing

Hi @nachovizzo,

Thank you for your awesome work and open source code!

From the paper and the code I think it is possible to use PUMA in online localization mode (as mentioned in Section IV-E: "Registration Algorithm").
However, I am wondering if it is possible to create an aligned map at the same time and to continue mapping even when no map is available anymore.
Does PUMA already do that?
If you have any advice for this question, I will be very grateful!

Thanks in advance!

How to run puma on Newer College dataset?

Thanks for your work!

I'm running puma on the Newer College dataset, but I can't get "complete" mesh results:

I first ran with the default config set in puma.yml, but the result seems to be incomplete:
[screenshot: incomplete mesh]

The ground truth looks like this:
[screenshot: ground-truth mesh]

I tried setting min_density to 0.01, but nothing changes.

And with min_density set to 0, the result becomes much stranger:
[screenshot: mesh with min_density = 0]

Another question is about the tracking result. When puma processes this dataset, it loses track; here is the tracking result:
[screenshot: trajectory plot]
The green line is the GT pose, and the blue line is the tracking result.

When I change the processing frame gap to 2, meaning that puma processes every second frame (e.g. 00001.pcd, 00003.pcd, 00005.pcd, ...), the tracking works well:

[screenshot: trajectory with frame gap 2]

What should I do to get a correct and complete result?

missing config file

Hi, thanks for your great work.
I have successfully compiled puma locally on my server. I find there is no config file in this repo. Since it plays an important role in the Poisson reconstruction, I wonder how I can find an example config. Thanks.
@nachovizzo

Metrics calculation source code

Good afternoon!
Thank you very much for publishing the source code! You have done a great job and proposed a very interesting SLAM approach.
I am particularly interested in your approach to map quality evaluation, it seems to me it is very promising. I would like to reproduce your metrics and evaluate other mapping algorithms.
Could you please share the source code for map quality estimation?

which version of python and ubuntu should I use

1. Should I use Python 3.6.9 (the Ubuntu 18.04 default) to run your script? I noticed that embree.sh uses "pip install", so I wonder whether to use Python 2 or Python 3.
2. Another problem is that trimesh cannot detect the Embree engine, even though I ran embree.sh and installed pyembree successfully. I cannot resolve the "trimesh.ray.has_embree" API, and I also cannot find it in the official API documentation or source code. Can I just comment it out and ignore it?

Using docker, PermissionError: [Errno 13] Permission denied: './data/kitti-odometry' ERROR: 1

Dear authors,
Thank you for your brilliant work. I am trying to use it with docker (Ubuntu 18.04). I set $DATASETS as follows:

ruanjy@ruanjy:~/Workspace/noRosCode_ws/puma/apps$ echo $DATASETS
/home/ruanjy/HDD/dataset_all/kitti_odometry/data_odometry_velodyne

and when I run (changing the path a little):

ruanjy@ruanjy:~/Workspace/noRosCode_ws/puma$ docker-compose run --rm apps bash -c '\
>     ./data_conversion/bin2ply.py \
>     --dataset $DATASETS/dataset/ \
>     --out_dir ./data/kitti-odometry/ply/ \
>     --sequence 07
>     '

I get the following error:

Creating puma_apps_run ... done
Converting .bin scans into .ply fromat from:/data//dataset/ to:./data/kitti-odometry/ply/
Traceback (most recent call last):
  File "./data_conversion/bin2ply.py", line 111, in <module>
    main()
  File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "./data_conversion/bin2ply.py", line 86, in main
    os.makedirs(out_dir, exist_ok=True)
  File "/usr/lib/python3.8/os.py", line 213, in makedirs
    makedirs(head, exist_ok=exist_ok)
  File "/usr/lib/python3.8/os.py", line 223, in makedirs
    mkdir(name, mode)
PermissionError: [Errno 13] Permission denied: './data/kitti-odometry'
ERROR: 1

I find that I can write into the /data folder as root, but I fail to run the command again with 'sudo'; I think that is because $DATASETS gets reset. When I switch to root via 'su', I get the same error as before.
I checked that my user id and group id are both 1000.

Can you help me to figure out the problem? Thank you very much!

ERROR:139

Hello there,

I encountered ERROR: 139 while running puma_pipeline.py on my own datasets collected from gazebo.

[scan #180] Running PSR over local_map: 58%|██▉ [10:49<09:44, 4.50s/ scans][WARNING] /root/Open3D/build/poisson/src/ext_poisson/PoissonRecon/Src/FEMTree.Initialize.inl (Line 193)
Initialize
Found bad data: 675972
ERROR: 139

If I only reconstruct on the previous frames, it works. Is there a problem with the data at scan #180? What is the specific issue and how can I avoid it?

pip install --user . with Error: No module named 'pip' & No module named 'pybind11'

When running "pip install --user .", I got these error messages. Has anyone met this before?

(occu) zlq@qh1:/mnt3/zlq/code/puma$ pip install pybind11
Requirement already satisfied: pybind11 in /home/zlq/anaconda3/envs/occu/lib/python3.7/site-packages (2.10.0)

(occu) zlq@qh1:/mnt3/zlq/code/puma$ pip install --user .
Processing /mnt3/zlq/code/puma
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error
  
Getting requirements to build wheel did not run successfully.
exit code: 1
> [27 lines of output]
      Traceback (most recent call last):
        File "/home/zlq/anaconda3/envs/occu/bin/pip", line 5, in <module>
          from pip._internal.cli.main import main
      ModuleNotFoundError: No module named 'pip'
      Traceback (most recent call last):
        File "/home/zlq/anaconda3/envs/occu/bin/pip", line 5, in <module>
          from pip._internal.cli.main import main
      ModuleNotFoundError: No module named 'pip'
      256
      256
      Traceback (most recent call last):
        File "/home/zlq/anaconda3/envs/occu/lib/python3.7/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 363, in <module>
          main()
        File "/home/zlq/anaconda3/envs/occu/lib/python3.7/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 345, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
        File "/home/zlq/anaconda3/envs/occu/lib/python3.7/site-packages/pip/_vendor/pep517/in_process/_in_process.py", line 130, in get_requires_for_build_wheel
          return hook(config_settings)
        File "/tmp/pip-build-env-9rghrs16/overlay/lib/python3.7/site-packages/setuptools/build_meta.py", line 338, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=['wheel'])
        File "/tmp/pip-build-env-9rghrs16/overlay/lib/python3.7/site-packages/setuptools/build_meta.py", line 320, in _get_build_requires
          self.run_setup()
        File "/tmp/pip-build-env-9rghrs16/overlay/lib/python3.7/site-packages/setuptools/build_meta.py", line 483, in run_setup
          self).run_setup(setup_script=setup_script)
        File "/tmp/pip-build-env-9rghrs16/overlay/lib/python3.7/site-packages/setuptools/build_meta.py", line 335, in run_setup
          exec(code, locals())
        File "<string>", line 8, in <module>
      ModuleNotFoundError: No module named 'pybind11'
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

Getting requirements to build wheel did not run successfully.
exit code: 1
See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

Failed to run puma_pipeline on our datasets.

Hello there,

I have successfully run the demo in docker. But when I try to run it on two of our own datasets, I get the same error in the puma pipeline. The error information is as follows:

Creating puma_apps_run ... done
Results will be saved to results/kitti-odometry_07_depth_10_cropped_p2l_raycasting.txt
[scan #30] Running PSR over local_map:  74%|█████████████████████▌       [00:04<00:01,  6.67 scans/s]
Traceback (most recent call last):
  File "./pipelines/slam/puma_pipeline.py", line 176, in <module>
    main()
  File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 829, in __call__
    return self.main(*args, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 782, in main
    rv = self.invoke(ctx)
  File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 1066, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/usr/local/lib/python3.8/dist-packages/click/core.py", line 610, in invoke
    return callback(*args, **kwargs)
  File "./pipelines/slam/puma_pipeline.py", line 156, in main
    mesh, _ = create_mesh_from_map(
  File "/usr/local/lib/python3.8/dist-packages/puma/mesh/poisson.py", line 46, in create_mesh_from_map
    return run_poisson(pcd, depth, n_threads, min_density)
  File "/usr/local/lib/python3.8/dist-packages/puma/mesh/poisson.py", line 37, in run_poisson
    vertices_to_remove = densities < np.quantile(densities, min_density)
  File "<__array_function__ internals>", line 5, in quantile
  File "/usr/local/lib/python3.8/dist-packages/numpy/lib/function_base.py", line 3930, in quantile
    return _quantile_unchecked(
  File "/usr/local/lib/python3.8/dist-packages/numpy/lib/function_base.py", line 3937, in _quantile_unchecked
    r, k = _ureduce(a, func=_quantile_ureduce_func, q=q, axis=axis, out=out,
  File "/usr/local/lib/python3.8/dist-packages/numpy/lib/function_base.py", line 3515, in _ureduce
    r = func(a, **kwargs)
  File "/usr/local/lib/python3.8/dist-packages/numpy/lib/function_base.py", line 4050, in _quantile_ureduce_func
    n = np.isnan(ap[-1])
IndexError: index -1 is out of bounds for axis 0 with size 0
ERROR: 1
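The failing call is the density-based vertex filtering inside run_poisson: np.quantile raises IndexError on an empty densities array, i.e. when no points made it into the Poisson reconstruction (often a sign of wrong paths or empty input scans). A hedged sketch of that filtering step with an explicit guard (illustrative only, not the repo's exact code):

```python
import numpy as np

def low_density_mask(densities, min_density):
    """Mark vertices whose Poisson density is below the min_density
    quantile, mirroring puma's vertex-removal step (sketch only)."""
    densities = np.asarray(densities, dtype=float)
    if densities.size == 0:
        # np.quantile on an empty array raises the IndexError seen above,
        # so fail early with a clearer message instead.
        raise ValueError("empty density array: no points reached Poisson reconstruction")
    return densities < np.quantile(densities, min_density)

print(low_density_mask([1.0, 2.0, 3.0, 4.0], 0.5))  # -> [ True  True False False]
```

If you hit this error, checking that the --dataset path actually contains the converted .ply scans is a good first step.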

question about build_gt_cloud.py

I want to generate the ground-truth point cloud of the street and compare it with the reconstructed model. But when I execute the following command, there are some errors. If you can, please help me solve the following problem. I will appreciate it very much!

docker-compose run --rm apps bash -c '\
    ./pipelines/mapping/build_gt_cloud.py \
    --dataset ./data/kitti-odometry/ply \
    --sequence 07 \
    --n_scans 40
    '

[screenshot of the error]
If this method doesn't work, can you tell me how to generate street gt point cloud?

no module " pybind11"

Thanks for your excellent work.
When I run the command "docker-compose up --build builder", I get this error message at step 9/16:
Successfully installed easydict-1.10 pykitti-0.3.1 trimesh-3.20.2
WARNING: Running pip as the 'root' user can result in broken permissions and conflicting behaviour with the system package manager. It is recommended to use a virtual environment instead: https://pip.pypa.io/warnings/venv
Processing /puma
Installing build dependencies: started
Installing build dependencies: finished with status 'done'
Getting requirements to build wheel: started
Getting requirements to build wheel: finished with status 'error'
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> [17 lines of output]
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in
main()
File "/usr/local/lib/python3.8/dist-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
File "/usr/local/lib/python3.8/dist-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
return hook(config_settings)
File "/tmp/pip-build-env-e_smm85x/overlay/lib/python3.8/site-packages/setuptools/build_meta.py", line 338, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=['wheel'])
File "/tmp/pip-build-env-e_smm85x/overlay/lib/python3.8/site-packages/setuptools/build_meta.py", line 320, in _get_build_requires
self.run_setup()
File "/tmp/pip-build-env-e_smm85x/overlay/lib/python3.8/site-packages/setuptools/build_meta.py", line 484, in run_setup
super(_BuildMetaLegacyBackend,
File "/tmp/pip-build-env-e_smm85x/overlay/lib/python3.8/site-packages/setuptools/build_meta.py", line 335, in run_setup
exec(code, locals())
File "", line 3, in
ModuleNotFoundError: No module named 'pybind11'
[end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Service 'builder' failed to build: The command '/bin/sh -c cd /puma && pip install --no-cache-dir --requirement requirements.txt && pip install --no-cache-dir --upgrade . && rm -rf /puma' returned a non-zero code: 1

Open3D with GeneralizedICP support not installed, check INSTALL.md

INFO - 2021-09-29 22:58:32,090 - utils - Note: NumExpr detected 12 cores but "NUMEXPR_MAX_THREADS" not set, so enforcing safe limit of 8.
INFO - 2021-09-29 22:58:32,090 - utils - NumExpr defaulting to 8 threads.
Open3D with GeneralizedICP support not installed, check INSTALL.md

I have installed Open3D with GeneralizedICP support successfully. Why does this message still appear?

Couldn't connect to Docker daemon at http+docker://localunixsocket - is it running?

Hello, I have installed docker and docker-compose according to the documentation. In the third step, I try to build the apps docker container. I entered the 'make' command in the terminal, but the following error occurs:
[screenshot]
I entered the 'sudo docker-compose' command in the terminal to solve this problem, but I get the following error:
[screenshot]
So I come here to ask for your help. Thank you.

The speed gradually slows down!

Hello, this project looks very good, so I ran it successfully with a few minor modifications to fit my environment. However, it seems that the time to process each point cloud frame keeps increasing. Is the reason that every frame is matched against all the previous meshes?

Faulty results for mai city dataset

Hello,

I am trying to use this method to generate a mesh out of the Mai city dataset. Sadly, the algorithm introduces artifacts and poorly aligned geometry, resulting in overlapping walls and objects. Is this due to a wrong configuration or to limitations of the method?

Thanks in advance for any help!

My configuration:
[screenshot of the configuration]

The resulting mesh:
[screenshot of the resulting mesh]

‘make’ Errors

Thank you very much for your great work!
When I enter 'make' in the terminal, the following happens. Can you help me find out what's wrong? I would appreciate it very much! Sincerely waiting for your reply!

[screenshot of the error]
