
Add NDT sensor model #338

Merged · 8 commits merged into main from ramiro/ndt-map on Apr 10, 2024
Conversation

@serraramiro1 (Collaborator) commented Mar 25, 2024

Proposed changes

Adds a new sensor model, based on NDT transforms, that performs importance weighting by transforming the measurements.

Type of change

  • 🐛 Bugfix (change which fixes an issue)
  • 🚀 Feature (change which adds functionality)
  • 📚 Documentation (change which fixes or extends documentation)

Checklist

Put an x in the boxes that apply. This is simply a reminder of what we will require before merging your code.

  • Lint and unit tests (if any) pass locally with my changes
  • I have added tests that prove my fix is effective or that my feature works
  • I have added necessary documentation (if appropriate)
  • All commits have been signed for DCO

Additional comments

Anything worth mentioning to the reviewers.

@serraramiro1 (Collaborator, Author) commented:

@hidmic PTAL when possible. It's still a draft, but it might be good to get the review process going.

A few concerns I have:

Citing the NDT-MCL paper:

> In practice we approximate Eq. 4 by finding the cell from the map that corresponds to the mean µ̄_i = R_k µ_i + t_k and search the local neighborhood in order to find the closest normal distribution to µ̄_i.

I'm not doing a nearest-neighbor search here, but we could preprocess the grid to make that work, or modify the grid <-> NDT map conversion script to take it into account. Likelihoods fall off so abruptly away from the mean that I don't expect it to matter much, but we would certainly have some boundary effects with the current approach.
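For discussion, a minimal sketch of what that local-neighborhood lookup could look like, assuming a hypothetical sparse map keyed by integer cell coordinates. `NdtCell`, `closest_cell`, and the fixed 3x3 search window are all illustrative, not the actual beluga API:

```cpp
#include <array>
#include <cmath>
#include <limits>
#include <optional>
#include <unordered_map>

// Hypothetical NDT cell: only the fitted mean matters for the lookup.
struct NdtCell {
  double mean_x;
  double mean_y;
};

// Hash for integer cell coordinates.
struct CellHash {
  std::size_t operator()(const std::array<int, 2>& c) const {
    return std::hash<int>{}(c[0]) ^ (std::hash<int>{}(c[1]) << 16);
  }
};

using NdtMap = std::unordered_map<std::array<int, 2>, NdtCell, CellHash>;

// Search the 3x3 neighborhood around the cell containing (x, y) and return
// the cell whose stored mean is closest to the transformed measurement mean,
// as the quoted paper suggests.
std::optional<NdtCell> closest_cell(const NdtMap& map, double x, double y, double resolution) {
  const int ci = static_cast<int>(std::floor(x / resolution));
  const int cj = static_cast<int>(std::floor(y / resolution));
  std::optional<NdtCell> best;
  double best_dist = std::numeric_limits<double>::infinity();
  for (int di = -1; di <= 1; ++di) {
    for (int dj = -1; dj <= 1; ++dj) {
      const auto it = map.find({ci + di, cj + dj});
      if (it == map.end()) continue;
      const double dx = it->second.mean_x - x;
      const double dy = it->second.mean_y - y;
      const double dist = dx * dx + dy * dy;
      if (dist < best_dist) {
        best_dist = dist;
        best = it->second;
      }
    }
  }
  return best;
}
```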

I don't love the way they suggest scaling the likelihood with d1 and d2. Likelihoods are not bounded and can be any positive number, so the current version would favor perfect hits on narrower NDT cells over perfect hits on cells with bigger covariances.

I'd like to try a different version where

weight = unscaled_likelihood_at(measurement) / unscaled_likelihood_at(mean)

This way, our weights would have an upper bound of 1 and would make our lives easier.

Also I'd consider using the log likelihood instead of the likelihood to make the weighting function less spiky and prevent us from giving poor scores to very-close hits to a narrow distribution.
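To make the proposal concrete, here is a hedged 1-D sketch (the real model deals with multivariate NDT cells, and the function names are illustrative). The normalization constant cancels in the ratio, so the weight reduces to the exponential term, bounded by 1, and its logarithm is a smooth quadratic penalty rather than a spiky exponential:

```cpp
#include <cmath>

// Proposed bounded weight, 1-D for clarity: dividing the unscaled
// likelihood at the measurement by its value at the cell mean cancels
// the normalization constant, leaving only the exponential term.
double bounded_weight(double measurement, double mean, double variance) {
  const double error = measurement - mean;
  return std::exp(-0.5 * error * error / variance);  // in (0, 1]
}

// Log-space counterpart: a quadratic penalty, which avoids assigning
// near-zero weight to hits that land close to a narrow distribution.
double log_bounded_weight(double measurement, double mean, double variance) {
  const double error = measurement - mean;
  return -0.5 * error * error / variance;  // in (-inf, 0]
}
```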

@hidmic (Collaborator) commented Mar 25, 2024

> I'm not doing nearest neighbor search here, but we could preprocess the grid to make that work or modify the conversion grid <-> NDT map script to take that into account,

IIUC nearest neighbors here is just a performance optimization, so it's OK for a start if we don't do it. I do wonder about the detour in the algorithm from what Saarinen et al. propose. We are not applying NDT to the measurement like Biber et al. describe (i.e. discretizing space, fitting normal distributions within each cell) before computing likelihoods. Why?

> likelihoods fall so abruptly when you go away from the mean that I don't expect it to matter much but we would certainly have some boundary effects with the current approach.

And that can be problematic, numerically speaking. Working with degenerate, point-mass distributions (as we do here instead of using NDT in full) won't make it any better. Even working with full NDT may not be enough: both Biber et al. and Schulz et al. suggest having multiple map layers, offset from each other, to deal with boundary effects. We are not doing that either.

> The way they're suggesting the scaling of the likelihood with d1 and d2 I don't love.

Yeah, there's some handwaving to those scalings. $d_2$ I can understand, though: by artificially inflating covariances you can compensate for limited sample size issues (i.e. the issues that particle filters have when measurement likelihood distributions are too narrow).
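One way to read $d_2$, sketched in 1-D under the assumption that it scales the exponent of the unnormalized likelihood (as in Magnusson-style NDT scoring; the function names are illustrative): scaling the exponent by $d_2 < 1$ is algebraically the same as inflating the variance by $1/d_2$, which widens the likelihood.

```cpp
#include <cmath>

// d2 applied as a scale on the exponent of an unnormalized Gaussian.
double scaled_likelihood(double error, double variance, double d2) {
  return std::exp(-0.5 * d2 * error * error / variance);
}

// The same quantity, written as a Gaussian with variance inflated by 1/d2.
// For d2 < 1 the distribution is wider, mitigating particle depletion
// when the measurement model is too narrow.
double inflated_likelihood(double error, double variance, double d2) {
  return std::exp(-0.5 * error * error / (variance / d2));
}
```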

> Likelihoods are not bounded, and can be any positive number. The current version would favor perfect hits to narrower NDT cells to perfect hits to cells with bigger covariances.

I don't follow you here.

> This way, our weights would have an upper bound of 1 and would make our lives easier.

We do normalize though, typically after reweighting.

> Also I'd consider using the log likelihood instead of the likelihood to make the weighting function less spiky and prevent us from giving poor scores to very-close hits to a narrow distribution.

+1. That's something I've been wanting to try for a while. I have a strong suspicion that many of the heuristics that the other sensor models rely on when computing likelihoods can be traced back to numerical issues. Log likelihood and log odds representations may help. I would defer that to a different PR though.

@hidmic (Collaborator) left a comment:


Second pass. Overall looks reasonable.

Review comments (outdated, resolved) on:
  • beluga/include/beluga/sensor/ndt_sensor_model.hpp
  • beluga/test/beluga/sensor/test_ndt_model.cpp
Signed-off-by: Ramiro Serra <serraramiro@ekumenlabs.com>
@serraramiro1 serraramiro1 marked this pull request as ready for review April 8, 2024 15:48
@serraramiro1 serraramiro1 requested a review from hidmic April 8, 2024 15:48
@serraramiro1 (Collaborator, Author) commented:

@hidmic ready to re-review.
Note that I couldn't fully extend this to 3D, because the sparse grids are 2-D only. Making them support N-dimensional data would involve going back to BaseGrid2, and I'd rather not go that deep for now.

I'll create a ticket for this effort and leave it here if you're OK with it.

Signed-off-by: Ramiro Serra <serraramiro@ekumenlabs.com>
@hidmic (Collaborator) left a comment:


Overall LGTM. The coverage check is in a weird state.

Review comments (outdated, resolved) on:
  • beluga/test/beluga/sensor/data/test_ndt_cell.cpp
  • beluga/include/beluga/sensor/ndt_sensor_model.hpp
@hidmic hidmic changed the title Add ndt sensor model Add NDT sensor model Apr 10, 2024
@serraramiro1 serraramiro1 merged commit 33f1573 into main Apr 10, 2024
8 checks passed
@serraramiro1 serraramiro1 deleted the ramiro/ndt-map branch April 10, 2024 17:35