Hi, I am trying to use your work with depth maps generated by the monodepth2 depth estimation network.
I tried using one of your KITTI samples to test the pipeline, with the depth configuration set to:
conf.depth_mean = [107.0805491, 68.26778312, 133.50751215]
conf.depth_std = [38.65614623, 73.59464917, 88.24401221]
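(As an aside, a minimal sketch of how per-channel statistics like these can be recomputed over one's own depth maps; the depth_maps/ folder name is only a placeholder, not an actual path from the repo:)

import glob
import numpy as np
import PIL.Image as pil

# Placeholder folder of 3-channel depth maps; adjust the pattern to your data.
paths = sorted(glob.glob("depth_maps/*.png"))
pixels = np.concatenate([np.asarray(pil.open(p).convert("RGB"), dtype=np.float64).reshape(-1, 3)
                         for p in paths])
print("depth_mean =", pixels.mean(axis=0).tolist())
print("depth_std  =", pixels.std(axis=0).tolist())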
Even with this configuration, the maximum score I get is less than 30 percent.
The depth map and image are uploaded for reference.
Are there any configurations I need to change in the monodepth2 code itself to get this working?
The model used for testing is the pretrained model provided in the repo.
My monodepth2 post-processing code is:
import numpy as np
import torch
import matplotlib as mpl
import matplotlib.cm as cm
import PIL.Image as pil
from layers import disp_to_depth  # helper from the monodepth2 repo (layers.py)

_, scaled_depth = disp_to_depth(disp, 0.1, 100)  # second return value is depth, mapped into the [0.1, 100] m range
depth_resized = torch.nn.functional.interpolate(scaled_depth, (original_height, original_width), mode="bilinear", align_corners=False)  # upsample to the original image size
depth_resized_np = depth_resized.squeeze().cpu().numpy()
normalizer_depth = mpl.colors.Normalize(vmin=depth_resized_np.min())  # vmax is left unset, so it autoscales to the per-image maximum
mapper_depth = cm.ScalarMappable(norm=normalizer_depth, cmap='gray')
colormapped_im_depth = (mapper_depth.to_rgba(depth_resized_np)[:, :, :3] * 255).astype(np.uint8)  # grayscale colormap replicated across the RGB channels
im_depth = pil.fromarray(colormapped_im_depth)
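(The map is then written to disk; the file name below is only a placeholder added for illustration, not my actual output path:)

im_depth.save("depth_placeholder.png")  # placeholder name; PNG keeps the 8-bit values lossless
check = np.asarray(pil.open("depth_placeholder.png"))
assert check.shape == (original_height, original_width, 3) and check.dtype == np.uint8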
Depth Map:
Image:
The output is:
Car -1 -1 -0.420610 690.218506 147.377106 1291.588501 441.522003 1.693756 1.490031 3.745631 2.358071 1.346878 1.293291 0.648531 0.281668
Car -1 -1 -0.314688 271.298126 136.579041 803.299255 431.537445 1.713774 1.436281 3.459033 1.215498 1.356887 1.107118 0.517339 0.234700
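(For reference, these lines appear to follow the standard KITTI detection result layout: class, truncation, occlusion, alpha, 2D box, 3D dimensions h/w/l, 3D location x/y/z, rotation_y, and score, so the confidence is the last field. A minimal parsing sketch under that assumption:)

def parse_kitti_result(line):
    fields = line.split()
    return {
        "type": fields[0],
        "bbox": [float(v) for v in fields[4:8]],          # 2D box: x1, y1, x2, y2 in pixels
        "dimensions": [float(v) for v in fields[8:11]],   # h, w, l in metres
        "location": [float(v) for v in fields[11:14]],    # x, y, z in camera coordinates
        "rotation_y": float(fields[14]),
        "score": float(fields[15]),                       # detection confidence
    }

det = parse_kitti_result("Car -1 -1 -0.420610 690.218506 147.377106 1291.588501 441.522003 "
                         "1.693756 1.490031 3.745631 2.358071 1.346878 1.293291 0.648531 0.281668")
print(det["score"])  # 0.281668, i.e. the sub-30-percent maximum score mentioned above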