Can I use getDistsQuantile(1.0) to get the final distance? #125

kwill opened this Issue Oct 9, 2015 · 6 comments



kwill commented Oct 9, 2015

I am looking for a way to measure how "well" a registration attempt fits. My supervisor mentioned the Hausdorff distance, a sum of the squared distances between closest points, which sounds an awful lot like ICP! :)

Before I reinvent the wheel: am I correct that calling icp(data, ref) and then calling getDistsQuantile(1.0) will give me the same (or a very similar) result?

My reading is that every ICP object contains a Matches object and, after registration, the Matches object contains all the closest-point pairings and their respective distances, and getDistsQuantile(1.0) will return the summed distance for every pair. Is this correct?

pomerlef commented Oct 9, 2015

In your question, you say:

[...] Hausdorff distance, a sum of the squared distances between closest points, [...]

From what I understand from the wiki entry, it's the maximum distance of the set of closest points. If you define the distance metric as being the squared Euclidean distance, then yes, getDistsQuantile(1.0) should give you what you're looking for.

A few notes:

  • The Hausdorff distance won't be symmetric unless you match A -> B as well as B -> A and then take the max of the two getDistsQuantile(1.0) values.
  • In practice, this distance is not very robust to large outliers because it relies on the max function. This paper seems to compare robust versions of the Hausdorff distance.
  • To make sense, the two point clouds you compare need to represent the same object (i.e., the expected overlap should be 100%). That may be the case for your application, but it isn't for 3D mapping, for example.
kwill commented Oct 9, 2015

Thanks for the confirmation, and the reference! My clouds are of the same object (or should be, hence the need for a test) - our scenario is matching a reference to an object on a tabletop.

It sounds like getDistsQuantile(1.0) is worth trying as a starting point, and if the detection rate isn't good enough for our scenario, I'll add new methods for alternative distance measures.

kwill added commits to kwill/libpointmatcher that referenced this issue on Oct 11, 2015:

  • calculate and store squared distance from final ICP iteration (calls matches.getDistsQuantile(1.0))
      * make this value accessible from the icp object
      * initialise to -1 (i.e. invalid before ICP is called)
      * see ethz-asl#125
  • add final definition for getFinalSquaredDistance - see ethz-asl#125
      * give this experimental branch a semver-compliant version string
  • finalSquaredDistance should be a T, not unsigned - see ethz-asl#125 (a388ef9)
kwill commented Oct 11, 2015

I've implemented this as outlined below. In my ICP "scoring" filter I have tried to maximise the number of points considered in the Euclidean distance calculation, and to make the result deterministic by performing a brute-force comparison.

However, the results for the same reading-reference pair are quite variable. Is there anything else I can do in the ICP filter to:

a) consider all possible points (*) for the Euclidean distance calculation
b) make the result more consistent?

(*) The reference cloud has 7000-20000 points, the reading cloud has 250-600 points.

In ICP.cpp (pseudocode)

TransformationParameters computeWithTransformedReference() {
    // icp setup logic

    while (iterate) {
        // icp loop logic

        if (!iterate) // on last iteration, store squared distance
            finalSquaredDistance = getDistsQuantile(1.0);
    }

    // more icp logic
    return transformationParameters;
}

From outside (pseudocode):

// attempt an ICP registration
icp = new PM.ICP();
icp.loadFromYaml("default.yaml"); // this config voxelises to account for the resolution differences
result = icp(reading, reference);

// get transformed reference as point cloud
newReference = new DataPoints(reference.features, reference.featureLabels);
icp.transformations.apply(newReference, result);

// use identity filter to score reading against transformed reference
scoringICP = new PM.ICP();
scoringICP.loadFromYaml("identity.yaml");
scoringICP(reading, newReference);
float score1 = scoringICP.getFinalSquaredDistance();
scoringICP(newReference, reading);
float score2 = scoringICP.getFinalSquaredDistance();
float score = max(score1, score2);

Identity filter ("identity.yaml" above). A single iteration with this config should return the squared Euclidean distance between the closest points in the two clouds:

readingDataPointsFilters:
  - IdentityDataPointsFilter

referenceDataPointsFilters:
  - IdentityDataPointsFilter

matcher:
  KDTreeMatcher:
    knn: 40
    epsilon: 0
    searchType: 0

outlierFilters:
  - NullOutlierFilter

transformationCheckers:
  - CounterTransformationChecker:
      maxIterationCount: 1

(I have also tried knn = 2147483647 and epsilon = inf.)


pomerlef commented

I'm not sure what you mean by "quite variable". I added an example of how I would do it in icp_advance_api.cpp. See the section //START demo 1 in the code. You can run it with the example data sets:

./examples/icp_advance_api  ../examples/data/car_cloud401.csv ../examples/data/car_cloud400.csv


match ratio: 0.850013

Haussdorff distance: 21.5567 m
Haussdorff quantile distance: 0.325167 m
Robust mean distance: 0.0900166 m

writing to test_ref.vtk
writing to test_data_in.vtk
writing to test_data_out.vtk
ICP transformation:
   0.981308   -0.153002    0.116726  -0.0523912
   0.170089    0.973296   -0.154154    -0.21568
 -0.0900234    0.171127    0.981127 -0.00762224
          0           0           0           1

One mistake that I can see in your config file is that you use knn: 40. You want only the closest point for the Hausdorff distance (knn: 1). If you increase knn, the maximum distance will only get larger, because you are adding the second, third, up to the 40th closest points to the list of paired points.

epsilon: inf seems an interesting concept too. It would mean that you do not do backtracking in the kd-tree search. It should be fast but not accurate.


@kwill: How is that going? Can I close this issue?

kwill commented Nov 5, 2015

@pomerlef Thanks for the feedback. Because of my data structures we ended up implementing the Hausdorff distance in our C# layer instead. But your tips on filters were very useful, and I'll reference your example as a "future work" suggestion.

@kwill kwill closed this Nov 5, 2015