Hi, yes, it is definitely possible. Basically, you reconstruct 3D points from the depth data in the depth sensor frames. The SDK here has all the APIs you need for this process. You can refer to this issue post (#64) as a starting point.
This is possible and not so difficult, but you need to know how, and there are several important things to consider. The short- and long-throw depth frames don't give you the real depth value; they give you the distance from the clipping plane to the points. You need to perform an unprojection mapping using the u and v matrices.
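A minimal sketch of that unprojection step (all names here are hypothetical: `u_map` and `v_map` stand for the per-pixel u/v unprojection matrices mentioned above, assumed to map each pixel onto the camera's unit plane, and the raw depth value is assumed to be a range in millimetres along each pixel's ray rather than a plain z-depth):

```python
import numpy as np

def depth_to_point_cloud(depth, u_map, v_map):
    # depth: (H, W) raw depth frame, assumed to be in millimetres
    # u_map, v_map: (H, W) per-pixel unprojection values mapping each
    # pixel to (u, v) coordinates on the camera's unit plane (z = 1)
    d = depth.astype(np.float32) / 1000.0          # millimetres -> metres
    # Build each pixel's ray direction through the unit plane.
    dirs = np.stack([u_map, v_map, np.ones_like(u_map)], axis=-1)
    # Assumption: the raw value is a distance along the ray, so scale the
    # *normalised* ray direction by it (not the unnormalised one).
    dirs /= np.linalg.norm(dirs, axis=-1, keepdims=True)
    points = dirs * d[..., None]                   # (H, W, 3)
    valid = depth > 0                              # drop invalid readings
    return points[valid]                           # (N, 3) point cloud
```

If the sensor instead reports a z-depth relative to the clipping plane, the normalisation line would be dropped and the unit-plane directions scaled directly by the depth value.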
Is it possible to create a colored point cloud from the obtained data?
If you can, could you please give us some tips on how to do this?