Add Service to get depth from Image Pixel Location using the Lidar #21
What will the service return if the pixel in the undistorted image is outside the Velodyne's field of view (it only has a 15-degree vertical field of view)? Also, if we want to get the range for a large number of pixels at a time, is the overhead of that many service calls worth the convenience?
Both are good things to think about. We can throw out data that doesn't have a mapping; that's easy. As for using service calls to get the range for a bunch of pixels and then doing the smoothing and averaging on the client side, I think it should instead be handled on the server side. The client would call something like `Eigen::Vector3f p = server.get_depth_from_image(int x, int y)` and could assume the server has already done the smoothing, so the returned value is safe and the service overhead stays low.
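The kind of server-side smoothing described above could be sketched as follows. This is only an illustrative sketch, not code from the repo: `smoothed_depth` and its outlier threshold are hypothetical, and it assumes the server has already gathered the ranges of lidar points that project near the requested pixel.

```python
import numpy as np

def smoothed_depth(candidate_depths, outlier_sigma=2.0):
    """Hypothetical server-side smoothing: reject outliers, then average.

    candidate_depths: ranges of lidar points projecting near the pixel.
    Points more than outlier_sigma standard deviations from the median
    are discarded before averaging.
    """
    d = np.asarray(candidate_depths, dtype=float)
    med = np.median(d)
    sigma = d.std()
    if sigma == 0:
        return med  # all candidates agree
    keep = np.abs(d - med) <= outlier_sigma * sigma
    return d[keep].mean()

# A stray 50 m return among ~5 m returns is rejected before averaging:
depth = smoothed_depth([5.0, 5.1, 4.9, 50.0])
```

The client then receives one already-smoothed value per request, which keeps the service interface simple.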
I'm currently working on this for the right camera. I could probably make it more generic to work for other cameras as well. dev...ironmig:detect-deliver-dev
Here's my current plan: The service request has the following fields:
The response includes
Returning an array rather than a single point allows for more use cases, like finding a normal to a plane (@RustyBamboo and I will be using this in the detect-deliver mission). NOTE: I'm not sure how this would work with stereo @DSsoto
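The plane-normal use case mentioned above is a standard least-squares fit over the returned point array. A minimal sketch (the function name is illustrative, not from the codebase): SVD of the mean-centered points gives the plane normal as the singular vector with the smallest singular value.

```python
import numpy as np

def plane_normal(points):
    """Estimate the unit normal of a best-fit plane through 3D points.

    points: (N, 3) array with N >= 3 non-collinear points.
    The right-singular vector for the smallest singular value of the
    mean-centered points is the direction of least variance, i.e. the
    plane normal.
    """
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)
    return vt[-1]  # unit-length normal (sign is arbitrary)

# Points on the z = 0 plane -> normal along the z axis
n = plane_normal([[0, 0, 0], [1, 0, 0], [0, 1, 0], [1, 1, 0]])
```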
This is now available for the right camera as of #81. If it is needed for another camera, it can easily be configured to support one.
Once we have found the object we are looking for in the image, we typically want to make some maneuver based on it, i.e., we need to know its location in 3D space relative to the boat.
I will make a service that, given a pixel location and the camera's extrinsics relative to the boat, uses the lidar to tell where the object is. Then in the mission system we just use one function to get the distance rather than interacting with the lidar directly.
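The core of such a service is projecting lidar points into the camera and looking up the point nearest the requested pixel. A minimal sketch, assuming a pinhole model, an undistorted image, and lidar points already transformed into the camera frame (the function name, `K`, and `max_pixel_dist` are illustrative assumptions, not the actual service interface):

```python
import numpy as np

def depth_from_pixel(pixel, lidar_points, K, max_pixel_dist=10.0):
    """Return the 3D point (camera frame) whose image projection is
    closest to `pixel`, or None if nothing projects nearby.

    pixel:        (u, v) in the undistorted image.
    lidar_points: (N, 3) points already in the camera frame.
    K:            3x3 camera intrinsic matrix.
    """
    pts = np.asarray(lidar_points, dtype=float)
    pts = pts[pts[:, 2] > 0]            # keep points in front of the camera
    if len(pts) == 0:
        return None
    proj = (K @ pts.T).T                # pinhole projection
    uv = proj[:, :2] / proj[:, 2:3]     # normalize by depth
    dist = np.linalg.norm(uv - np.asarray(pixel, dtype=float), axis=1)
    i = int(np.argmin(dist))
    if dist[i] > max_pixel_dist:
        return None                     # pixel outside lidar coverage
    return pts[i]
```

Returning `None` when no lidar point projects within `max_pixel_dist` addresses the field-of-view concern raised earlier: pixels outside the Velodyne's coverage simply get no mapping.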