Comparative Evaluation of Hand-Crafted and Learned Local Features

This repository contains the instructions and the code for evaluating feature descriptors on our image-based reconstruction benchmark. The details of our local feature benchmark can be found in our paper:

"Comparative Evaluation of Hand-Crafted and Learned Local Features".
J.L. Schönberger, H. Hardmeier, T. Sattler and M. Pollefeys. CVPR 2017.

Paper, Supplementary, Bibtex

You might also be interested in the HPatches benchmark by Balntas and Lenc et al. presented at CVPR 2017.

Benchmark Results

This table lists the latest benchmark results. Note that the results differ from the original paper, since they were updated with the latest COLMAP version. If you want to submit your own results, please open a new issue or pull request for this repository. Note that the table below extends to the right and can alternatively be viewed in a code or text editor.

Metrics:

| Dataset | Method | # Images | # Reg. Images | # Sparse Points | # Observations | Track Length | Obs. Per Image | Reproj. Error [px] | # Dense Points | Dense Error [2cm] | Dense Error [10cm] | Mean Pose Error [m] | Median Pose Error [m] | # Inlier Pairs | # Inlier Matches |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Fountain | SIFT | 11 | 11 | 14722 | 70631 | 4.79765 | 6421.00 | 0.392893 | 292609 | | | | | 55 | 127734 |
| | SIFT-PCA | | 11 | 14281 | 67776 | 4.74588 | 6161.45 | 0.379411 | 295870 | | | | | 55 | 117257 |
| | DSP-SIFT | | 11 | 14867 | 71153 | 4.78596 | 6468.45 | 0.414944 | 293789 | | | | | 55 | 130820 |
| | ConvOpt | | 11 | 14717 | 70614 | 4.79812 | 6419.45 | 0.393435 | 296522 | | | | | 55 | 127540 |
| | TFeat | | 11 | 14273 | 67584 | 4.73509 | 6144.00 | 0.372782 | 298433 | | | | | 55 | 113928 |
| | LIFT | | 11 | 6003 | 28296 | 4.71364 | 2572.36 | 0.580594 | 304258 | | | | | 55 | 52293 |
| Herzjesu | SIFT | 8 | 8 | 7502 | 31670 | 4.22154 | 3958.75 | 0.431632 | 241347 | | | | | 28 | 48965 |
| | SIFT-PCA | | 8 | 7161 | 29735 | 4.15235 | 3716.87 | 0.409061 | 245291 | | | | | 28 | 44443 |
| | DSP-SIFT | | 8 | 7769 | 32809 | 4.22306 | 4101.12 | 0.459535 | 238122 | | | | | 28 | 51893 |
| | ConvOpt | | 8 | 4957 | 20227 | 4.08049 | 2528.37 | 0.387640 | 242262 | | | | | 26 | 27830 |
| | TFeat | | 8 | 7061 | 29232 | 4.13992 | 3654.00 | 0.404879 | 247065 | | | | | 28 | 43297 |
| | LIFT | | 8 | 3742 | 14890 | 3.97915 | 1861.25 | 0.620034 | 241173 | | | | | 28 | 22683 |
| South-Building | SIFT | 128 | 128 | 108124 | 653975 | 6.04838 | 5109.18 | 0.545747 | 2141964 | | | | | 3822 | 2036024 |
| | SIFT-PCA | | 128 | 105612 | 632145 | 5.98554 | 4938.63 | 0.531500 | 2090915 | | | | | 3979 | 1927873 |
| | DSP-SIFT | | 128 | 112719 | 666808 | 5.91566 | 5209.43 | 0.580537 | 2141873 | | | | | 3958 | 2076833 |
| | ConvOpt | | 128 | 62306 | 397579 | 6.38107 | 3106.08 | 0.487924 | 2117221 | | | | | 1901 | 984762 |
| | TFeat | | 128 | 102143 | 604357 | 5.91677 | 4721.53 | 0.510260 | 2089004 | | | | | 4342 | 1751327 |
| | LIFT | | 128 | 42601 | 233110 | 5.47193 | 1821.17 | 0.730874 | 2154755 | | | | | 2830 | 711142 |
| Madrid Metropolis | SIFT | 1344 | 500 | 116088 | 733745 | 6.32053 | 1467.49 | 0.605330 | 1822434 | | | | | 227092 | 6969437 |
| | SIFT-PCA | | 469 | 111090 | 645437 | 5.81003 | 1376.19 | 0.586054 | 1571584 | | | | | 644573 | 13970478 |
| | DSP-SIFT | | 467 | 99514 | 649704 | 6.52877 | 1391.22 | 0.660135 | 1643614 | | | | | 135215 | 4586807 |
| | ConvOpt | | 348 | 40749 | 213176 | 5.23144 | 612.57 | 0.534638 | 1251705 | | | | | 665669 | 12531539 |
| | TFeat | | 435 | 102775 | 574980 | 5.59455 | 1321.79 | 0.566243 | 1536760 | | | | | 712501 | 15207011 |
| | LIFT | | 416 | 44056 | 303055 | 6.87885 | 728.497 | 0.768777 | 1577304 | | | | | 82562 | 2531640 |
| Gendarmenmarkt | SIFT | 1463 | 1035 | 338972 | 1872308 | 5.52348 | 1809.00 | 0.699118 | 4225031 | | | | | 321854 | 12625310 |
| | SIFT-PCA | | 975 | 349217 | 1690464 | 4.84072 | 1733.80 | 0.701904 | 3649260 | | | | | 822997 | 20321433 |
| | DSP-SIFT | | 979 | 293209 | 1577921 | 5.38155 | 1611.76 | 0.749714 | 2600189 | | | | | 265575 | 9315075 |
| | ConvOpt | | 772 | 178859 | 694211 | 3.88133 | 899.23 | 0.723822 | 2955105 | | | | | 811724 | 15583270 |
| | TFeat | | 902 | 280233 | 1324931 | 4.72796 | 1468.88 | 0.695517 | 3384513 | | | | | 655181 | 15040928 |
| | LIFT | | 959 | 142982 | 819940 | 5.73456 | 854.99 | 0.841945 | 3939957 | | | | | 125084 | 5012767 |
| Tower of London | SIFT | 1576 | 804 | 239951 | 1863301 | 7.76534 | 2317.53 | 0.615406 | 3050252 | | | | | 165097 | 11249925 |
| | SIFT-PCA | | 693 | 220381 | 1491686 | 6.76866 | 2152.50 | 0.602057 | 2518677 | | | | | 558173 | 14605601 |
| | DSP-SIFT | | 799 | 267906 | 1940752 | 7.24415 | 2428.97 | 0.655440 | 2946702 | | | | | 260963 | 12750104 |
| | ConvOpt | | 537 | 143397 | 788855 | 5.50119 | 1469.00 | 0.580207 | 2448215 | | | | | 742322 | 14648025 |
| | TFeat | | 675 | 255666 | 1605322 | 6.27898 | 2378.25 | 0.580068 | 2583560 | | | | | 926517 | 21742783 |
| | LIFT | | 713 | 96848 | 739340 | 7.63402 | 1036.94 | 0.728200 | 2879455 | | | | | 60841 | 3628677 |
| Alamo | SIFT | 2915 | 963 | 198433 | 2437084 | 12.28164 | 2530.72 | 0.647271 | 3737516 | | | | | 64068 | 21263831 |
| | SIFT-PCA | | 921 | 197723 | 2279339 | 11.52791 | 2474.85 | 0.626812 | 3256364 | | | | | 143747 | 20145150 |
| | DSP-SIFT | | 961 | 223192 | 2564659 | 11.49082 | 2668.73 | 0.712005 | 3815012 | | | | | 79973 | 23375984 |
| | ConvOpt | | 684 | 110261 | 1167754 | 10.59081 | 1707.24 | 0.537849 | 2546861 | | | | | 168383 | 8065721 |
| | TFeat | | 865 | 180730 | 2040775 | 11.29184 | 2359.27 | 0.609598 | 2973035 | | | | | 192115 | 16518550 |
| | LIFT | | 796 | 78892 | 1011117 | 12.816471 | 1270.24 | 0.768177 | 2900266 | | | | | 40219 | 8151208 |
| Roman Forum | SIFT | 2364 | 1679 | 433152 | 3603662 | 8.31962 | 2146.31 | 0.708420 | 9630170 | | | | | 76547 | 16424472 |
| | SIFT-PCA | | 1663 | 434317 | 3267075 | 7.52232 | 1964.56 | 0.674920 | 9379870 | | | | | 151694 | 15134227 |
| | DSP-SIFT | | 1644 | 464792 | 3653745 | 7.86103 | 2222.47 | 0.749306 | 9429283 | | | | | 100827 | 16469792 |
| | ConvOpt | | 1282 | 182922 | 1263324 | 6.90635 | 985.43 | 0.627904 | 7404163 | | | | | 158940 | 6151296 |
| | TFeat | | 1603 | 401965 | 2897537 | 7.20843 | 1807.57 | 0.647753 | 9096825 | | | | | 180301 | 12869235 |
| | LIFT | | 1503 | 174430 | 1420800 | 8.14538 | 945.30 | 0.814467 | 8584480 | | | | | 49413 | 5775222 |
| Cornell | SIFT | 6514 | 6073 | 1847141 | 12865681 | 6.96518 | 2118.50 | 0.660522 | 35232209 | | | | | 227478 | 61428156 |
| | SIFT-PCA | | 6010 | 1856258 | 12307131 | 6.63007 | 2047.77 | 0.643796 | 35263104 | | | | | 417668 | 59874790 |
| | DSP-SIFT | | 6069 | 2071407 | 13671952 | 6.60032 | 2252.75 | 0.708143 | 35449395 | | | | | 283503 | 64364585 |
| | ConvOpt | | 5009 | 938316 | 6082683 | 6.48255 | 1214.35 | 0.570824 | 30619302 | | | | | 353461 | 25017605 |
| | TFeat | | 5779 | 1730263 | 11292717 | 6.52659 | 1954.09 | 0.622775 | 33917778 | | | | | 489447 | 55385797 |
| | LIFT | | 5518 | 739059 | 4602081 | 6.22694 | 834.01 | 0.730208 | 33372173 | | | | | 143408 | 19144270 |

Runtime:

| Method | Runtime | Hardware |
| --- | --- | --- |
| SIFT | 9.3s | Intel E5-2697 2.60GHz CPU (single-threaded) |
| SIFT-PCA | 10.5s | Intel E5-2697 2.60GHz CPU (single-threaded) |
| DSP-SIFT | 23.7s | Intel E5-2697 2.60GHz CPU (single-threaded) |
| ConvOpt | 49.9s | Intel E5-2697 2.60GHz CPU, NVIDIA Titan X GPU |
| TFeat | 11.8s | Intel E5-2697 2.60GHz CPU, NVIDIA Titan X GPU |
| LIFT | 212.3s | Intel E5-2697 2.60GHz CPU, NVIDIA Titan X GPU |

References:

  • SIFT: D.G. Lowe. Object Recognition from Local Scale-Invariant Features. ICCV, 1999. R. Arandjelovic and A. Zisserman. Three Things Everyone Should Know to Improve Object Retrieval. CVPR, 2012.
  • SIFT-PCA: A. Bursuc, G. Tolias, and H. Jegou. Kernel Local Descriptors with Implicit Rotation Matching. ACM Multimedia, 2015.
  • DSP-SIFT: J. Dong and S. Soatto. Domain-Size Pooling in Local Descriptors: DSP-SIFT. CVPR, 2015.
  • ConvOpt: K. Simonyan, A. Vedaldi, and A. Zisserman. Learning Local Feature Descriptors Using Convex Optimisation. PAMI, 2014.
  • TFeat: V. Balntas, E. Riba, D. Ponsa, and K. Mikolajczyk. Learning Local Feature Descriptors with Triplets and Shallow Convolutional Neural Networks. BMVC, 2016.
  • LIFT: K.M. Yi, E. Trulls, V. Lepetit, and P. Fua. LIFT: Learned Invariant Feature Transform. ECCV, 2016.


local-feature-evaluation's Issues

colmap_import sqlite3 error

Hi,

I was able to follow all steps from the instructions but am having a problem importing features/matches into colmap.

(I am using my own detector/descriptor, but I made sure the writing format is correct, by reading with the provided MATLAB functions and similarly by checking that the read_matrix function from colmap_import.py reads the features properly).

When running

python scripts/colmap_import.py --dataset_path path/to/Fountain

I get

Importing features for 0010.png
Traceback (most recent call last):
  File "scripts/colmap_import.py", line 94, in <module>
    main()
  File "scripts/colmap_import.py", line 65, in main
    memoryview(keypoints)))
sqlite3.InterfaceError: Error binding parameter 3 - probably unsupported type.

I have investigated quite a bit, but still cannot figure out what the problem is. It looks like the issue is with memoryview(keypoints). Could this potentially be a Python version problem? (Are you using Python 2.7?)

Another thing I am wondering: is the command (i.e., the one that ends on line 65 of colmap_import.py) trying to write just the memory address of keypoints into data? I am not familiar with SQL, so apologies if this is a dumb question. If so, maybe I could cast the memory address returned by memoryview(keypoints) to an int before passing it into the cursor.execute call?

BTW, when I run print(memoryview(keypoints)) right before the error, I get <memory at 0x7f39516b4478> -- does this look like what it should be?

Thanks for the help!
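For reference, Python 3's sqlite3 module binds raw bytes directly, so serializing the array with .tobytes() sidesteps memoryview binding problems (Python 2's sqlite3 expects buffer objects instead, which is a plausible cause of the error above). A minimal round-trip sketch; the table layout only mimics COLMAP's keypoints table and the array contents are arbitrary:

```python
import sqlite3

import numpy as np

# Round-trip sketch (Python 3): store keypoints as a raw-bytes BLOB and
# restore them using the stored shape.
connection = sqlite3.connect(":memory:")
connection.execute(
    "CREATE TABLE keypoints "
    "(image_id INTEGER PRIMARY KEY, rows INTEGER, cols INTEGER, data BLOB)")

keypoints = np.random.rand(100, 4).astype(np.float32)
connection.execute(
    "INSERT INTO keypoints VALUES (?, ?, ?, ?)",
    (1, keypoints.shape[0], keypoints.shape[1], keypoints.tobytes()))

rows, cols, blob = connection.execute(
    "SELECT rows, cols, data FROM keypoints WHERE image_id = 1").fetchone()
restored = np.frombuffer(blob, dtype=np.float32).reshape(rows, cols)
assert np.array_equal(restored, keypoints)
```

The dtype passed to np.frombuffer must match the dtype the array was written with, otherwise the restored matrix is garbage even though no error is raised.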

colmap building error: Could not find a package configuration file provided by "Ceres"

Hi,
when building COLMAP, I ran into the problem below:
CMake Error at CMakeLists.txt:87 (find_package):
By not providing "FindCeres.cmake" in CMAKE_MODULE_PATH this project has
asked CMake to find a package configuration file provided by "Ceres", but
CMake did not find one.

Could not find a package configuration file provided by "Ceres" with any of
the following names:

CeresConfig.cmake
ceres-config.cmake

Add the installation prefix of "Ceres" to CMAKE_PREFIX_PATH or set
"Ceres_DIR" to a directory containing one of the above files. If "Ceres"
provides a separate development package or SDK, be sure it has been
installed.

-- Configuring incomplete, errors occurred!
How can I solve this?

Make error: camera.cc.o and src/CMakeFiles/colmap.dir/all... seems to be related to the Ceres Solver

I have received the following error.

Looking at the details, it seems that the error is related to the Ceres Solver, which I have just recently updated (please see the figure below).

Therefore, I guess this package relies on a specific Ceres commit (from about two years ago).

Can you share the commit id of the Ceres Solver you used, so I can check out that commit and work with your repo?

src/CMakeFiles/colmap.dir/build.make:128: recipe for target 'src/CMakeFiles/colmap.dir/base/camera.cc.o' failed
make[2]: *** [src/CMakeFiles/colmap.dir/base/camera.cc.o] Error 1
CMakeFiles/Makefile2:853: recipe for target 'src/CMakeFiles/colmap.dir/all' failed
make[1]: *** [src/CMakeFiles/colmap.dir/all] Error 2
Makefile:149: recipe for target 'all' failed
make: *** [all] Error 2

Screenshot from 2020-06-18 10-43-20

Screenshot from 2020-06-18 10-43-31

Extract the statistics

I am not sure I understand how to extract the statistics after dense reconstruction, as described in INSTRUCTIONS.md.

Can anyone show me how to do it step by step? And if I use the COLMAP GUI to reconstruct the sparse or dense model, how can I extract the statistics? Thank you.

no such table: two_view_geometries

Traceback (most recent call last):
  File "scripts/reconstruction_pipeline.py", line 304, in <module>
    main()
  File "scripts/reconstruction_pipeline.py", line 270, in main
    matching_stats = import_matches(args)
  File "scripts/reconstruction_pipeline.py", line 144, in import_matches
    cursor.execute("SELECT count(*) FROM two_view_geometries WHERE rows > 0;")
sqlite3.OperationalError: no such table: two_view_geometries

I ran reconstruction_pipeline.py to completion but got this error.

approximate_matching : matlab error at [status, output] = system(command);

Hello. I have tried to use approximate_matching. However, I constantly get an error at system(command) in MATLAB (MATLAB 2018b, Ubuntu 18.04).

[status, output] = system(command);

The error is shown as follows:

86 if num_images < 2 % 2000

Running command: /home/userxx/colmap/build/src/tools/vocab_tree_retriever_float
Running command:  --database_path 

Running command: /Fountain/d2_netD2-Net-nophototourism-FP32-detThr0-max-global_ApproxMatch_Scr1p00_Mtch0p95_TESTFORApprox/database.db
Running command:  --descriptor_path 
Running command: /Fountain/d2_netD2-Net-nophototourism-FP32-detThr0-max-global_ApproxMatch_Scr1p00_Mtch0p95_TESTFORApprox/descriptors
Running command:  --vocab_tree_path 
Running command: /Fountain/d2_netD2-Net-nophototourism-FP32-detThr0-max-global_ApproxMatch_Scr1p00_Mtch0p95_TESTFORApprox/Oxford5k/vocab-tree.bin
Error using matlab.system.internal.executeCommand
Arguments must contain a character vector.
Error in dos (line 66)
[varargout{1:nargout}] = matlab.system.internal.executeCommand(varargin{:});

Error in approximate_matching (line 11)
[status, output] = system(command);

Error in matching_pipeline_file (line 90)
        approximate_matching

Because Oxford5k is quite large, I tried running this on the Fountain dataset (setting the number of images to a low number such as 2).

PS. Also, before running this code ("approximate_matching.m"), I executed the following two COLMAP commands (according to the instructions at https://github.com/ahojnnes/local-feature-evaluation/blob/master/INSTRUCTIONS.md), which seem to work fine:

./colmap/build/src/tools/vocab_tree_builder_float  
    --descriptor_path /Fountain/d2_netD2-Net-nophototourism-FP32-detThr0-max-global_ApproxMatch_Scr1p00_Mtch0p95_TESTFORApprox/descriptors  
    --database_path /Fountain/d2_netD2-Net-nophototourism-FP32-detThr0-max-global_ApproxMatch_Scr1p00_Mtch0p95_TESTFORApprox/database.db 
    --vocab_tree_path /Fountain/d2_netD2-Net-nophototourism-FP32-detThr0-max-global_ApproxMatch_Scr1p00_Mtch0p95_TESTFORApprox/Oxford5k/vocab-tree.bin 

And

./colmap/build/src/tools/vocab_tree_retriever_float \
    --descriptor_path /Fountain/d2_netD2-Net-nophototourism-FP32-detThr0-max-global_ApproxMatch_Scr1p00_Mtch0p95_TESTFORApprox/descriptors  
    --database_path /Fountain/d2_netD2-Net-nophototourism-FP32-detThr0-max-global_ApproxMatch_Scr1p00_Mtch0p95_TESTFORApprox/database.db 
    --vocab_tree_path /Fountain/d2_netD2-Net-nophototourism-FP32-detThr0-max-global_ApproxMatch_Scr1p00_Mtch0p95_TESTFORApprox/Oxford5k/vocab-tree.bin
    > /Fountain/d2_netD2-Net-nophototourism-FP32-detThr0-max-global_ApproxMatch_Scr1p00_Mtch0p95_TESTFORApprox/retrieval.txt   

Results:

Loading descriptors...
  => Loaded a total of 190278 descriptors
Building index for visual words...
 => Quantized descriptor space using 64516 visual words
Saving index to file... 

missing vocab_tree_retriever_float/vocab_tree_builder_float

It seems like colmap does not have vocab_tree_retriever_float.cc or vocab_tree_builder_float.cc, as described in the current version of the instructions when using large datasets.

I could only find versions without the "float" suffix (e.g., I can find vocab_tree_retriever but not vocab_tree_retriever_float).

I suppose one can use the non-float ones, but I would guess the performance would suffer a bit, since the non-float ones perform quantization while the "float" ones mentioned in the instructions presumably do not.

extremely slow running time of Gendarmenmarkt dataset

I followed the instructions to reproduce the statistics on the Gendarmenmarkt dataset. After feature matching, it comes to

python scripts/reconstruction_pipeline.py \
    --dataset_path path/to/Fountain \
    --colmap_path path/to/colmap/build/src/exe

This script has already been running for over 3 days, so I think there might be something wrong?
I checked the task manager and found that two processes are running:

~/colmap/build/src/exe/dense_fuser --workspace_path ~/data/Gendarmenmarkt/dense/0 --input_type photometric --output_path ~/data/Gendarmenmarkt/dense/0
~/colmap/build/src/exe/dense_fuser --workspace_path ~/data/Gendarmenmarkt/dense/0 --input_type photometric --output_path ~/data/Gendarmenmarkt/dense/0/fused.

Is this normal? Otherwise, how can I accelerate the process?

about matches_importer

I do not have the executable './colmap/build/src/exe/matches_importer'. How can I get it?

Error using pdist2mex: X and Y inputs to PDIST2MEX must both be double, or both be single

I'm trying to run this benchmark using databases Herzjesu and Fountain but got the following error on both:

Computing features for 0000.png [1/11] -> skipping, already exist
Computing features for 0006.png [7/11] -> skipping, already exist
Computing features for 0004.png [5/11] -> skipping, already exist
Computing features for 0009.png [10/11] -> skipping, already exist
Computing features for 0001.png [2/11] -> skipping, already exist
Computing features for 0008.png [9/11] -> skipping, already exist
Computing features for 0003.png [4/11] -> skipping, already exist
Computing features for 0005.png [6/11] -> skipping, already exist
Computing features for 0010.png [11/11] -> skipping, already exist
Computing features for 0007.png [8/11] -> skipping, already exist
Computing features for 0002.png [3/11] -> skipping, already exist
Matching block [1/1, 1/1]Warning: Converting non-floating point data to single. 
> In pdist2 (line 228)
  In match_descriptors (line 18)
  In exhaustive_matching (line 57)
  In matching_pipeline (line 81) 
Error using pdist2mex
X and Y inputs to PDIST2MEX must both be double, or both be single.

Error in pdist2 (line 352)
        D = pdist2mex(X',Y',dist,additionalArg,smallestLargestFlag,radius);

Error in match_descriptors (line 18)
dists = pdist2(descriptors1, descriptors2, 'euclidean');

Error in exhaustive_matching (line 57)
                    matches = match_descriptors(descriptors(oidx1), ...

Error in matching_pipeline (line 81)
    exhaustive_matching

I've installed all dependencies, downloaded the databases and keypoints, and configured the TODO variables in the matching_pipeline.m script. Any idea about this problem? Best regards.
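The MATLAB error above occurs because pdist2mex requires both inputs in single or double precision, and binary descriptors loaded as integer types trigger it. A NumPy analog of the fix, casting both descriptor matrices to the same float type before the distance computation (array sizes are arbitrary):

```python
import numpy as np

# Integer-typed descriptors (e.g. loaded from a binary file) must be cast
# to a floating-point type before computing Euclidean distances, mirroring
# pdist2's single/double requirement in MATLAB.
descriptors1 = (np.random.rand(5, 128) * 255).astype(np.uint8)
descriptors2 = (np.random.rand(7, 128) * 255).astype(np.uint8)

d1 = descriptors1.astype(np.float32)
d2 = descriptors2.astype(np.float32)

# Pairwise Euclidean distances between all descriptor pairs.
dists = np.sqrt(((d1[:, None, :] - d2[None, :, :]) ** 2).sum(axis=2))
assert dists.shape == (5, 7)  # one distance per descriptor pair
```

In MATLAB the equivalent one-line fix is casting the loaded descriptors with single(...) before calling pdist2.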

how to generate 'database.db' by using 'matching_pipeline.m'

Hello, I think some files are missing from this project. After running matching_pipeline.m (it is missing pdist2.m to compute the Euclidean distance of descriptors, so I used my own), I obtained files in 'descriptors', 'keypoints', and 'matches' as described in the instructions. However, I cannot continue the evaluation with the Python script 'reconstruction_pipeline.py', since it requires a database file as input, which I have not found after running the matching pipeline. I did find a 'DATABASE_PATH' defined in 'matching_pipeline.m', but I could not find where it is used. Can you help me fix this problem?

Adding new results to benchmark

Hi Dr. Schönberger,

We ran this benchmark for our new descriptor R2D2 and a pretrained version of SuperPoint. Should we add the results to the README?

Cheers,
Noe Pion from Naver Labs Europe

make error (base/feature.h)

Hi, Thanks for your work!

I followed your instruction, and almost everything went well. However, when I did

cd colmap/build
cmake .. -DTEST_ENABLED=OFF
make -j8

During make -j8, an error occurred:

[ 93%] Building CXX object src/tools/CMakeFiles/vocab_tree_retriever_float.dir/vocab_tree_retriever_float.cc.o
[ 95%] Built target inverted_file_entry_test
/home/phantom/projects/local-feature-evaluation/colmap/src/tools/vocab_tree_retriever_float.cc:17:26: fatal error: base/feature.h: No such file or directory
compilation terminated.
src/tools/CMakeFiles/vocab_tree_retriever_float.dir/build.make:62: recipe for target 'src/tools/CMakeFiles/vocab_tree_retriever_float.dir/vocab_tree_retriever_float.cc.o' failed
make[2]: *** [src/tools/CMakeFiles/vocab_tree_retriever_float.dir/vocab_tree_retriever_float.cc.o] Error 1
CMakeFiles/Makefile2:6243: recipe for target 'src/tools/CMakeFiles/vocab_tree_retriever_float.dir/all' failed
make[1]: *** [src/tools/CMakeFiles/vocab_tree_retriever_float.dir/all] Error 2
make[1]: *** Waiting for unfinished jobs....

For reference, there is no feature.h in colmap/src/base/.

Could you please help with this?

RANSAC parameters for geometric verification

First of all, thank you for your excellent work. I was wondering if you could provide your RANSAC configuration for the evaluation? I didn't find it in the original paper or in this repo; or are you using the default configuration? I'm asking because I find that the default configuration fails to discard many visually unreliable inlier image pairs. Very strangely, 'min_inlier_ratio' does not seem to be working at all: pairs that should be discarded by this ratio still appear in the inlier pair list. Is it a COLMAP issue, or am I doing something wrong?

Thank you very much.
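For what it's worth, recent COLMAP versions expose the geometric-verification thresholds as SiftMatching.* command-line options; whether they resolve the min_inlier_ratio behavior described above is unclear. A hedged sketch of tightening them when importing matches (option names follow recent COLMAP releases and the values are illustrative, not the benchmark's configuration):

```python
import shutil
import subprocess

# Tighten COLMAP's RANSAC-based geometric verification via CLI options.
# Option names and defaults may differ between COLMAP releases.
cmd = [
    "colmap", "matches_importer",
    "--database_path", "database.db",
    "--match_list_path", "image-pairs.txt",
    "--match_type", "pairs",
    "--SiftMatching.max_error", "4",           # max epipolar error in pixels
    "--SiftMatching.min_inlier_ratio", "0.25",
    "--SiftMatching.min_num_inliers", "15",
]
if shutil.which("colmap"):  # only run when the binary is on PATH
    subprocess.call(cmd)
```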

error using reconstruction_pipeline.py: no such file or directory: matches_importer

When I use the command:

python3 scripts/reconstruction_pipeline.py --dataset_path data/Fountain/ --colmap_path ../colmap/build/src/exe

to evaluate the descriptor, an error occurs:

Traceback (most recent call last):
  File "scripts/reconstruction_pipeline.py", line 277, in <module>
    main()
  File "scripts/reconstruction_pipeline.py", line 243, in main
    matching_stats = import_matches(args)
  File "scripts/reconstruction_pipeline.py", line 114, in import_matches
    "--match_type", "pairs"])
  File "/usr/lib/python3.5/subprocess.py", line 557, in call
    with Popen(*popenargs, **kwargs) as p:
  File "/usr/lib/python3.5/subprocess.py", line 947, in __init__
    restore_signals, start_new_session)
  File "/usr/lib/python3.5/subprocess.py", line 1551, in _execute_child
    raise child_exception_type(errno_num, err_msg)
FileNotFoundError: [Errno 2] No such file or directory: '../colmap/build/src/exe/matches_importer'

But I can use the command "colmap matches_importer --dataset_path data/Fountain --match_list_path data/Fountain/image-pairs.txt" correctly.

How can I fix this? Looking forward to your reply!
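The likely cause is that newer COLMAP versions ship a single colmap binary with subcommands instead of standalone per-tool executables under build/src/exe, so matches_importer no longer exists as a separate file. A sketch of calling the subcommand form instead (the paths are placeholders from the issue above):

```python
import shutil
import subprocess

# Newer COLMAP builds expose matches_importer as a subcommand of the single
# `colmap` binary rather than as a standalone executable.
cmd = [
    "colmap", "matches_importer",
    "--database_path", "data/Fountain/database.db",
    "--match_list_path", "data/Fountain/image-pairs.txt",
    "--match_type", "pairs",
]
if shutil.which("colmap"):  # guard: skip if COLMAP is not installed
    subprocess.call(cmd)
```

In reconstruction_pipeline.py this means pointing the subprocess call at the colmap binary and passing "matches_importer" as its first argument, rather than joining the tool name onto the old exe directory.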

Cannot reproduce result of sift for madrid metropolis

Hi,
I could reproduce the results on Fountain, Herzjesu, and South-Building, but cannot reproduce the result for Madrid Metropolis. The numbers of inlier pairs and inlier matches are way too high, with lots of false positives. Is there a problem with the inlier matching code?

question about matches_importer

Hi,
I am testing my descriptor on the benchmark, but I run into a problem when importing feature matches into COLMAP.
The error on Fountain is:

==============================================================================
Custom feature matching
==============================================================================

Matching block [1/1]F1011 02:58:27.932655 3451297 database.cc:153] Check failed: matrix.size() * sizeof(typename MatrixType::Scalar) == num_bytes (44408 vs. 88816) 
*** Check failure stack trace: ***
    @     0x7fddebffc1c3  google::LogMessage::Fail()
    @     0x7fddec00125b  google::LogMessage::SendToLog()
    @     0x7fddebffbebf  google::LogMessage::Flush()
    @     0x7fddebffc6ef  google::LogMessageFatal::~LogMessageFatal()
    @     0x55bb5c715889  colmap::(anonymous namespace)::ReadDynamicMatrixBlob<>()
    @     0x55bb5c716972  colmap::Database::ReadMatches()
    @     0x55bb5c7ae0d5  colmap::FeatureMatcherCache::GetMatches()
    @     0x55bb5c7b26aa  colmap::SiftFeatureMatcher::Match()
    @     0x55bb5c7b7363  colmap::ImagePairsFeatureMatcher::Run()
    @     0x55bb5c94ca51  colmap::Thread::RunFunc()
    @     0x7fddea3f1de4  (unknown)
    @     0x7fddea517609  start_thread
    @     0x7fddea09d293  clone
/home/kunbpc/Installed/colmap/build/src/exe/colmap: /usr/local/lib/libtiff.so.5: no version information available (required by /lib/x86_64-linux-gnu/libfreeimage.so.3)
ERROR: Failed to parse options - unrecognised option '--export_path'.
Warning: Could not reconstruct any model

and my code is

subprocess.call([paths.colmap_path,
                 "matches_importer",
                 "--database_path", paths.database_path,
                 "--match_list_path", paths.match_list_path,
                 "--match_type", "pairs"])

I have checked the numpy shape of my matches, it is (x,2), and I apply exhaustive matching by a nested for loop, which is

    image_names = list(images.keys())
    image_pairs = []
    image_pair_ids = set()
    for idx_total, image_name1 in enumerate(tqdm(image_names[:-1])):
        # read descriptors for image1
        feature_path1 = paths.features_path/'{}.{}_local'.format(image_name1, configs['method_postfix'])
        descriptors1 = np.load(feature_path1)['descriptors']
        descriptors1 = torch.from_numpy(descriptors1).to(device)
        bar = tqdm(image_names[idx_total+1:])
        for idx_sub, image_name2 in enumerate(bar):
            image_pairs.append((image_name1, image_name2))
            image_id1, image_id2 = images[image_name1], images[image_name2]
            image_pair_id = image_ids_to_pair_id(image_id1, image_id2)
            if image_pair_id in image_pair_ids:
                continue

            # read descriptors for image2
            feature_path2 = paths.features_path/'{}.{}_local'.format(image_name2, configs['method_postfix'])
            descriptors2 = np.load(feature_path2)['descriptors']
            descriptors2 = torch.from_numpy(descriptors2).to(device)

            matches = matcher(descriptors1, descriptors2, **configs['matcher_config'])
            image_pair_ids.add(image_pair_id)
            if image_id1 > image_id2:
                matches = matches[:, [1, 0]]

            matches_str = matches.tostring()
            cursor.execute("INSERT INTO  matches(pair_id, rows, cols, data) "
                       "VALUES(?, ?, ?, ?);",
                       (image_pair_id, matches.shape[0], matches.shape[1],
                        matches_str))
            connection.commit()

The error says something is half the size it should be, but I have no idea what it refers to or what is wrong.
I also submitted the same issue on colmap repository.
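One plausible cause of the halved byte count (44408 vs. 88816) is the dtype of the matches array: COLMAP reads match blobs as 4-byte unsigned integers, so serializing a 2-byte dtype produces exactly half the bytes the check expects. A sketch of the size relationship, assuming NumPy; the fix would be casting to np.uint32 before serializing:

```python
import numpy as np

# Hypothetical repro: COLMAP expects rows * cols * 4 bytes for a match blob.
matches = np.array([[0, 3], [1, 7], [2, 5]], dtype=np.int16)

bad_blob = matches.tobytes()                     # 2 bytes per value
good_blob = matches.astype(np.uint32).tobytes()  # 4 bytes per value

assert len(bad_blob) == matches.size * 2   # half of what COLMAP expects
assert len(good_blob) == matches.size * 4  # what COLMAP's size check wants
```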

Issue while trying to add a custom dataset

Hello,

I'm trying to add custom datasets to your benchmark. My goal is to determine which method (SIFT, ...) is best for my specific application.

To do so, I tried to generate the database.db file associated with a custom dataset. However, I cannot figure out what params in the cameras table should contain (apparently it is binary data). Since I had no idea what kind of data to put there, I set it to NULL for every camera_id.

I generated descriptors, keypoints, and matches successfully with scripts/matching_pipeline.m (this script does not use the database). However, when I try to run scripts/reconstruction_pipeline.py on my custom dataset, I get the following errors:

==============================================================================
Loading database
==============================================================================

Loading cameras...F0917 10:12:14.569028  5185 database.cc:207] Check failed: num_params == camera.NumParams() (0 vs. 4) 
*** Check failure stack trace: ***
    @     0x7fd04f1b25cd  google::LogMessage::Fail()
    @     0x7fd04f1b4433  google::LogMessage::SendToLog()
    @     0x7fd04f1b215b  google::LogMessage::Flush()
    @     0x7fd04f1b4e1e  google::LogMessageFatal::~LogMessageFatal()
    @           0x6426a9  colmap::(anonymous namespace)::ReadCameraRow()
    @           0x647cd2  colmap::Database::ReadAllCameras()
    @           0x64b87d  colmap::DatabaseCache::Load()
    @           0x6ad367  colmap::IncrementalMapperController::LoadDatabase()
    @           0x6b0fba  colmap::IncrementalMapperController::Run()
    @           0x7f4eec  colmap::Thread::RunFunc()
    @     0x7fd04bb79c80  (unknown)
    @     0x7fd04e3a86ba  start_thread
    @     0x7fd04b2df41d  clone
    @              (nil)  (unknown)
Warning: Could not reconstruct any model

==============================================================================
Raw statistics
==============================================================================
{'num_inlier_matches': None, 'num_images': 43, 'num_inlier_pairs': 0}
None

==============================================================================
Formatted statistics
==============================================================================
Traceback (most recent call last):
  File "scripts/reconstruction_pipeline.py", line 297, in <module>
    main()
  File "scripts/reconstruction_pipeline.py", line 293, in main
    matching_stats["num_inlier_matches"]])) + " |")
TypeError: 'NoneType' object is not subscriptable

Do you have any idea what I did wrong here? Or anything I should consider when adding a custom dataset to your benchmark?
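For anyone hitting the same check: the params column is a blob of float64 values whose count must match the camera model (this matches how COLMAP's scripts/python/database.py writes cameras). The failing check (0 vs. 4) suggests a four-parameter model such as SIMPLE_RADIAL with f, cx, cy, k. A sketch of writing a valid row; the model id follows COLMAP's conventions and all numeric values are illustrative:

```python
import sqlite3

import numpy as np

# COLMAP stores camera parameters as a float64 blob; the number of values
# must match the camera model (SIMPLE_RADIAL, model id 2, expects f, cx, cy, k).
params = np.array([1200.0, 960.0, 540.0, 0.0], dtype=np.float64)

connection = sqlite3.connect(":memory:")
connection.execute(
    "CREATE TABLE cameras (camera_id INTEGER PRIMARY KEY, model INTEGER, "
    "width INTEGER, height INTEGER, params BLOB, prior_focal_length INTEGER)")
connection.execute(
    "INSERT INTO cameras VALUES (?, ?, ?, ?, ?, ?)",
    (1, 2, 1920, 1080, params.tobytes(), 0))

blob = connection.execute(
    "SELECT params FROM cameras WHERE camera_id = 1").fetchone()[0]
num_params = len(blob) // 8  # float64 = 8 bytes; must equal NumParams()
assert num_params == 4
```

Writing NULL leaves num_params at 0, which is exactly the 0 vs. 4 mismatch in the log above.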

how should I fix this?

CMake Error at CMakeLists.txt:87 (find_package):
By not providing "FindCeres.cmake" in CMAKE_MODULE_PATH this project has
asked CMake to find a package configuration file provided by "Ceres", but
CMake did not find one.

Could not find a package configuration file provided by "Ceres" with any of
the following names:

CeresConfig.cmake
ceres-config.cmake

Add the installation prefix of "Ceres" to CMAKE_PREFIX_PATH or set
"Ceres_DIR" to a directory containing one of the above files. If "Ceres"
provides a separate development package or SDK, be sure it has been
installed.

-- Configuring incomplete, errors occurred!
See also "/home/wsw/文档/local-feature-evaluation-master/colmap/build/CMakeFiles/CMakeOutput.log".

dist ratio and patch radius

Hi, thanks for your code!
After reading the code and your paper, I have some questions.

  1. You wrote in your paper that you do not enforce the ratio test by pruning descriptors whose top-ranked nearest neighbors are very similar. However, in your code, MATCH_MAX_DIST_RATIO is set to 0.8. If I'm not wrong, should we set it to 1.0?
  2. In your code, PATCH_RADIUS is set to 32. However, feature_extraction_tfeat.py uses patches of 31 pixels. I wonder if I should resize the generated patches to 31 pixels, or set the radius to 15 pixels?
    @ahojnnes
    Thanks for your help!
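To make the ratio-test question concrete, here is a NumPy sketch of nearest-neighbor matching with Lowe's ratio test (not the benchmark's MATLAB implementation): setting max_dist_ratio to 1.0 effectively disables the test, while 0.8 prunes matches whose best and second-best neighbors are nearly as close.

```python
import numpy as np

def match_descriptors(descriptors1, descriptors2, max_dist_ratio=1.0):
    """Nearest-neighbor matching with an optional Lowe ratio test."""
    # Pairwise Euclidean distances between the two descriptor sets.
    dists = np.linalg.norm(
        descriptors1[:, None, :] - descriptors2[None, :, :], axis=2)
    order = np.argsort(dists, axis=1)
    nearest, second = order[:, 0], order[:, 1]
    idx = np.arange(len(descriptors1))
    # Keep a match only if it is clearly better than the runner-up.
    keep = dists[idx, nearest] <= max_dist_ratio * dists[idx, second]
    return np.stack([idx[keep], nearest[keep]], axis=1)

descriptors = np.random.rand(8, 128).astype(np.float32)
# Matching a set against itself: every descriptor matches itself exactly.
matches = match_descriptors(descriptors, descriptors, max_dist_ratio=1.0)
assert (matches[:, 0] == matches[:, 1]).all()
```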

Full python implementation?

Thanks so much for releasing these evaluation metrics. I just wonder if you have a full Python implementation of the evaluation.
