Contribute NeighborClassifierFilter #1803
Conversation
I think this mostly looks OK. You should rename processOne to something else, because it conflicts with Stage::processOne and is confusing. It generates warnings on some compilers.
Is there any value to preserving the processOne override (which is used in several other filters)? The override is currently broken because of the need to pass the KDIndex and a PointRef that can be used to dereference the IDs it provides. These could be passed by other means, restoring the signature.
The reason to override is to allow generic use of processOne() by callers. In your case, processOne() doesn't follow the conventions of behavior expected by a stage, so it doesn't make any sense to try to preserve the override. processOne() is used by the pipeline infrastructure to support streaming, but your filter can't do streaming because it needs to calculate a KD-tree before it starts processing points.
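To illustrate the distinction above, here is a self-contained sketch with stand-in types (OffsetFilter, KnnLikeFilter, and the Point struct are hypothetical, not PDAL's actual classes):

```cpp
#include <cassert>
#include <vector>

struct Point { double x, y, z; int cls; };

// A filter that can honor a streaming processOne() contract: each point is
// fully processed the moment it arrives, with no dependence on later points.
struct OffsetFilter {
    bool processOne(Point& p) { p.z += 10.0; return true; }
};

// A KNN classifier cannot do this: it must see the complete input to build
// its spatial index (a plain copy stands in for the KD-tree here) before any
// neighbor query can be answered, so it has to operate on the whole view.
struct KnnLikeFilter {
    std::vector<Point> index_;
    void buildIndex(const std::vector<Point>& allPoints) { index_ = allPoints; }
};
```

The point of the contrast: OffsetFilter's work per point is independent, while KnnLikeFilter's buildIndex must run to completion before any per-point work can begin.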
Thanks for this clarification. I've changed the name and removed the virtual specifier.
The << function: good catch. It seems unneeded; I'll remove it.

Dimension optional: I agree with your focus on making common use cases easy. But really I think what we're seeing here is that the name wants to be KNNClassifyFilter. I'm inclined to remove "dimension" and change the name. Thoughts?

I think the "Candidate" KD-tree code is right. I create the KD-tree on the second (Candidate) set, but the IDs still come from the input set. For each point in the input set, we vote based on the nearest neighbors found in the Candidate. My specific workflow is that the Candidate has a subset of the input points. There's a unit test that exercises this.

Temporary PointRef: passing a copy of the View is only marginally better than passing a PointRef based on the View. The issue is that the KDIndex should expose some method/attribute that allows its client to interpret its results. I considered exposing the View from the KDTree, but thought that too disruptive for a first contribution. I'd be willing to do this if there's concurrence.
You're correct and I was wrong. I missed that you were passing a PointRef into KDIndex, which binds to the proper PointView when fetching the X/Y to find neighbors. That said, the test assumes that the code is correct; it doesn't really test the algorithm, though it will tell you if a change impacts the current behavior. On the name, perhaps NeighborClassifier or some such?
Can someone explain to me what is wrong with this? The reported failure from Travis:
I don't believe this link failure from SQLiteWriter has anything to do with the changes to the new NeighborClassifierFilter.
You're correct, that error has nothing to do with your PR.
filters/NeighborClassifierFilter.cpp
Outdated
void KNNAssignFilter::doOneNoDomain(PointRef &point, PointRef &temp, KD3Index &kdi)
{
    std::vector<PointId> iSrc = kdi.neighbors(point, m_k);
In the case that the query point and the KD3Index are from the same point cloud, neighbors will return the query point itself. If that is not the intended behavior, you could detect the situation and increment the number of neighbors accordingly.
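The self-neighbor issue and the suggested workaround (ask for one extra neighbor, then drop the query point) can be sketched with a brute-force stand-in for KD3Index::neighbors; Pt, dist2, and neighbors here are hypothetical helpers, not PDAL's API:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

struct Pt { double x, y, z; };

// Squared Euclidean distance between two points.
double dist2(const Pt& a, const Pt& b) {
    double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

// Brute-force stand-in for a KD-tree neighbors() query: indices of the k
// closest points in `cloud` to `query`, nearest first.
std::vector<std::size_t> neighbors(const std::vector<Pt>& cloud,
                                   const Pt& query, std::size_t k) {
    std::vector<std::size_t> idx(cloud.size());
    for (std::size_t i = 0; i < idx.size(); ++i) idx[i] = i;
    std::sort(idx.begin(), idx.end(), [&](std::size_t a, std::size_t b) {
        return dist2(cloud[a], query) < dist2(cloud[b], query);
    });
    idx.resize(std::min(k, idx.size()));
    return idx;
}
```

When the query point comes from the indexed cloud, it sits at distance zero and is returned first; requesting k + 1 neighbors and erasing the query's own index recovers the k distinct neighbors.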
Let's see ... I had two use cases in mind:
- A single point cloud where we need to smooth the classifications, so K > 1 (think a few stray mis-classifications)
- Two point clouds, where the candidate is completely classified and the source (where the query point comes from) is a superset of those points. We need to extrapolate the classifications from the smaller candidate to the larger source. Here K == 1.
Could be a later addition if it's useful, but have you also considered defining your neighborhood by radius?
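A radius neighborhood, as suggested above, would collect every point within a distance r rather than a fixed count k. A minimal brute-force sketch (Pt, dist2, and radiusNeighbors are hypothetical names, not PDAL's API):

```cpp
#include <cstddef>
#include <vector>

struct Pt { double x, y, z; };

// Squared Euclidean distance between two points.
double dist2(const Pt& a, const Pt& b) {
    double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

// All points of `cloud` within radius r of `q`. Comparing squared distances
// avoids a sqrt per point.
std::vector<std::size_t> radiusNeighbors(const std::vector<Pt>& cloud,
                                         const Pt& q, double r) {
    std::vector<std::size_t> out;
    for (std::size_t i = 0; i < cloud.size(); ++i)
        if (dist2(cloud[i], q) <= r * r)
            out.push_back(i);
    return out;
}
```

Unlike a fixed k, the neighbor count then adapts to local point density, which can matter for clouds with uneven sampling.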
It would also be good if you could change the name of the class to be consistent with the name of the filter. Do you have any other ideas on names that would better describe the behavior?
smh uh, ... yeah. That's embarrassing. I'll certainly fix this. Don't tell anyone. ;-)
From doc/stages/filters.knnassign.rst:
The knnassign filter allows you to update the value of a dimension for specific points
to a value determined by a K-nearest neighbors vote. For each point, the k
nearest neighbors are queried, and if more than half of them have the same
value for the specified dimension, the filter updates the selected point
accordingly.
For example, if an automated classification procedure left erroneous
vegetation points near the edges of buildings which were otherwise largely
classified correctly, you could try using this filter to fix that problem.
Similarly, some automated classification processes produce predictions for
only a subset of the original point cloud. This filter could be used to
extrapolate those predictions to the original.
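The majority vote described in the doc excerpt above can be sketched as a small self-contained function (knnVote is a hypothetical name, not the filter's actual implementation):

```cpp
#include <cstddef>
#include <map>
#include <vector>

// Majority vote over the neighbors' values for one dimension: returns the
// winning value if strictly more than half the neighbors agree on it,
// otherwise leaves the point's current value unchanged.
int knnVote(const std::vector<int>& neighborValues, int currentValue) {
    std::map<int, std::size_t> counts;
    for (int v : neighborValues)
        ++counts[v];
    for (const auto& kv : counts)
        if (kv.second * 2 > neighborValues.size())  // > half agree
            return kv.first;
    return currentValue;  // no majority: keep the existing value
}
```

For instance, a point classified as 6 (building) whose three neighbors are classified {2, 2, 6} would be reassigned to 2, while a 1-vs-1 split leaves it untouched.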