Short depth camera's focal length? #63
That data is, unfortunately, not available. You have to use
Hi @alexsyx, @FracturedShader has provided a good guideline for getting the 3D point cloud. Basically, you need to use the unprojection mapping. To make it more concrete, I will walk through a long_throw_depth example of getting the 3D point cloud.
The binary file basically stores the (u, v) coordinates on the unit plane. Suppose we have an image where the pixel coordinate of the top-left corner is (0, 0). Knowing the way the unprojection mapping is saved, we can now read the mapping (u, v).
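Reading that mapping back can be sketched with NumPy. This is a minimal illustration, not the repository's actual script: the file name, image size, and planar float32 layout (all u values followed by all v values) are assumptions you should check against your own capture.

```python
import numpy as np

def load_unprojection(path, width=448, height=450):
    """Read a (u, v) unprojection mapping from a raw binary file.

    Assumed layout: float32, all u values first, then all v values.
    The default width/height are placeholders; use your sensor's size.
    """
    data = np.fromfile(path, dtype=np.float32)
    n = width * height
    u = data[:n].reshape(height, width)
    v = data[n:2 * n].reshape(height, width)
    return u, v
```

If your file turns out to store interleaved (u, v) pairs instead, reshape to `(height, width, 2)` and split the last axis.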
Here is how u and v look. Now we can try to get the 3D points. Suppose we already have a depth map, Z (note that the stored values are negative); we can then get the 3D points as (u, v, 1) multiplied by -Z. (Optional) There are also some depth-map issues to be aware of. Hope this helps!
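That last step can be sketched in Python. The negative-millimeter depth convention and the division by 1000 follow the `projection*(-depth)/1000` formula quoted later in this thread; the function name is made up for illustration:

```python
import numpy as np

def depth_to_points(u, v, Z):
    """Unproject a depth map to 3D points.

    u, v : unit-plane coordinates from the unprojection mapping.
    Z    : raw depth map with negative values in millimeters (per the thread).
    Returns an (H, W, 3) array of points in meters.
    """
    d = -Z / 1000.0                 # positive depth in meters
    X = u * d
    Y = v * d
    W = np.ones_like(u) * d          # depth along the camera axis
    return np.stack([X, Y, W], axis=-1)
```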
Update: fixed the problem; it was MATLAB cruelly rounding my u, v matrices' values. I am still curious about the intrinsic parameters: if I wanted to retrieve them, could I just use the 2D-3D correspondences I found? Hi everyone, I am trying to recover the intrinsic parameters for the depth sensor, and I have been following the instructions in @Huangying-Zhan's answer. I used the suggested Python code to recover the u and v matrices and then wrote a little MATLAB function to get the point cloud from the depth frame.
Here is the MATLAB function; did I do something wrong?
Also, I understand from your answers that the only way to get the intrinsic parameters of the camera is to use u and v to find the 3D points and then compute the intrinsics matrix from the 3D-2D correspondences. Is that right?
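For what it's worth, under an ideal pinhole model where u = (col - u0) / fx and v = (row - v0) / fy, approximate intrinsics can be recovered directly from the u, v maps with a least-squares line fit. This is an illustrative sketch only; sign conventions and distortion on the real sensor may differ:

```python
import numpy as np

def fit_intrinsics(u, v):
    """Fit fx, fy, u0, v0 assuming u = (col - u0)/fx and v = (row - v0)/fy."""
    h, w = u.shape
    cols = np.tile(np.arange(w, dtype=np.float64), (h, 1))
    rows = np.tile(np.arange(h, dtype=np.float64)[:, None], (1, w))
    # Line fit u = a*col + b  =>  fx = 1/a, u0 = -b/a (same for v vs. row).
    a, b = np.polyfit(cols.ravel(), u.ravel(), 1)
    c, d = np.polyfit(rows.ravel(), v.ravel(), 1)
    return 1.0 / a, 1.0 / c, -b / a, -d / c
```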
If you are using short-throw depth data, you should treat values in the range of 200 to 1000 as valid. After you mask out the invalid ones, you should be able to get a correct 3D point cloud of the near scene, like your hand.
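A minimal sketch of that masking step (the 200-1000 range is in millimeters, per the comment above; the function name is made up):

```python
import numpy as np

def mask_valid(depth_mm, lo=200, hi=1000):
    """Zero out short-throw depth readings outside the valid [lo, hi] range."""
    valid = (depth_mm >= lo) & (depth_mm <= hi)
    return np.where(valid, depth_mm, 0), valid
```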
@Liebewill please upgrade to the latest Windows version on HoloLens and check out the latest commits of this repository. There was an incompatibility between the latest update and this repository's usage of the API. It should be fixed now.
Hi all, I already have the point cloud in the frame coordinate system. I got this from the (.bin) unprojection mapping: projection * (-depth) / 1000.
@mauronano How did you solve the problem with the messed-up point cloud? I'm having the same issue, but I don't think it's a rounding problem, at least in my case. Do you have some hints? I can post the code if necessary. Edit: I think I found out what's happening. In my original picture there are a lot of reflective surfaces (like two monitors and a whiteboard), which may have messed up the depth map. I'm just guessing, as I'm a beginner in this field.
@vitcozzolino, you are correct. In addition to reflective surfaces, anything with a black material/coating does not capture well either (as it absorbs infrared).
@mauronano How did you get the depth from long_throw_depth.csv? There are too many fields in the CSV file (which one?)
@cyrineee You don't get the depth from the .csv file. If you run the recording app, you get the depth data in the form of grayscale images arranged in folders, as in the picture; every pixel of these images is a measure of the distance of some "obstacle" in the depth camera's field of view. Each depth map (i.e., each grayscale image) comes with a timestamp, which you can then look up in the CSV file to get additional information on that particular frame (like the orientation of the camera when the frame was shot and some other camera parameters).
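The timestamp lookup described above could be sketched like this. The column name "Timestamp" is an assumption; check the actual header of your recorder's CSV:

```python
import csv

def find_frame_row(csv_path, timestamp):
    """Return the CSV row whose Timestamp column matches, or None."""
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["Timestamp"] == str(timestamp):
                return row
    return None
```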
@mauronano Thanks a lot for your explanations! Thanks in advance!
Hi all,
I can get the depth image from HoloLens, but I need to convert it to a 3D point cloud for other purposes,
so I want to get focal_x, focal_y, u0, and v0.
Can anyone tell me the values of those parameters, or a method to obtain them?
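For reference, once focal_x, focal_y, u0, and v0 are known, the standard pinhole back-projection that turns a depth image into a point cloud looks like this sketch (parameter names mirror the question; the values used are placeholders, not the real sensor's intrinsics):

```python
import numpy as np

def backproject(depth_m, focal_x, focal_y, u0, v0):
    """Back-project a depth image (meters) to an (H, W, 3) point cloud
    using the pinhole model: X = (col - u0)/fx * Z, Y = (row - v0)/fy * Z."""
    h, w = depth_m.shape
    cols, rows = np.meshgrid(np.arange(w), np.arange(h))
    X = (cols - u0) / focal_x * depth_m
    Y = (rows - v0) / focal_y * depth_m
    return np.stack([X, Y, depth_m], axis=-1)
```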