Good job! I have a question about depth map bits #3
Hi,
Thank you! Sorry to bother you.
Hi, sorry for the delayed response. For this, you need to go through the Habitat documentation on how depth maps are saved. For example, we used SceneNet, and for that dataset we had to multiply the provided depth maps by 0.001 to bring them to the same scale as the translation vectors.

Let me reiterate: the scale of the depth map does not depend on the code I've provided. It should be on the same scale as the translation vector, which is defined by the dataset. In other words, if you scale the translation vector and the depth maps by the same factor, the output should not change at all.

Ideally, if the depth map is saved as a raw data file without further post-processing (npy, for example), you wouldn't need to scale it at all. If some post-processing is done to save the depth map in a specific format (a PNG, for example, in SceneNet), then you will need to undo those post-processing steps. This will be defined by the dataset. If you're completely in the dark and have no idea what to do, I would suggest the following.
However, if the depth map has undergone some unique post-processing, as on the NeRF-Synthetic dataset, it is very hard to figure out by yourself unless the dataset owner provides you with that information.
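To make the scaling step above concrete, here is a minimal sketch of converting a raw integer depth map to the unit of the translation vectors. The 0.001 factor (millimetres to metres) matches what the maintainer describes for SceneNet; the function name and the example values are illustrative, not part of the repository's code, and the correct factor for any other dataset must come from that dataset's documentation.

```python
import numpy as np

def depth_to_translation_scale(depth_raw, scale=0.001):
    """Convert a raw depth map to the same unit as the translation vectors.

    `scale` undoes the dataset's storage post-processing; for a SceneNet-style
    16-bit PNG storing millimetres, 0.001 converts to metres.
    """
    return depth_raw.astype(np.float32) * scale

# Illustrative SceneNet-style depth values stored as 16-bit millimetres:
depth_mm = np.array([[1000, 2500],
                     [3000, 4000]], dtype=np.uint16)
depth_m = depth_to_translation_scale(depth_mm)  # now in metres
```

Note the invariance mentioned above: multiplying both `depth_m` and the translation vector by the same constant leaves the warp unchanged, so only the *relative* scale between the two matters.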
Thank you for such a detailed answer.
Sure. Good luck!
Hello!
Could you please tell me what the unit of the depth map is? I want to use the depth map of my own picture, but the depth image's units do not seem to match, so the warp results are different.