Experimental results using the NYU dataset #48

Closed

yuyu19970716 opened this issue May 11, 2022 · 11 comments
@yuyu19970716

Hello, I am still training PENet on the NYU dataset; please help me take a look. The third column in this image is the predicted result, right? I think this also shows that the network can run on the NYU dataset, is that correct? I want to know whether this network is suitable for a densely labelled dataset. Thanks, and looking forward to your reply.
[image: comparison_best]

@JUGGHM
Owner

JUGGHM commented May 11, 2022

I cannot say whether your results are normal, since I did not keep the experimental records on the NYU Depth V2 dataset. But if you have reached an RMSE of around 105 mm, then things seem to be going well.
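For context, depth-completion RMSE is usually reported in millimetres and computed only over pixels with valid ground truth; a minimal sketch of such a metric (not necessarily PENet's exact implementation):

```python
import torch

def rmse_mm(pred, gt):
    """RMSE in millimetres, computed only over valid ground-truth pixels.

    A minimal sketch, assuming `pred` and `gt` are depth maps in metres
    and that 0 marks missing ground truth.
    """
    valid = gt > 0
    diff_mm = (pred[valid] - gt[valid]) * 1000.0  # metres -> millimetres
    return torch.sqrt((diff_mm ** 2).mean()).item()
```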

@yuyu19970716
Author

Thank you very much for your reply again! I have been studying PENet for a long time, and my advisor later asked me to complete a project with an indoor dataset. The paper for this network only ran experiments on the KITTI dataset, and I have many new ideas for it, so I especially want to try. That is why I have a lot of questions, please bear with me!

The question I most want to ask is: does PENet work on indoor datasets?

The second major problem: the RMSE during my current training is very large. But I don't think the network is unable to predict the NYU depth maps; I suspect it has something to do with the units of the NYU and KITTI depth values. In the code, the KITTI 'd' and 'gt' values look like the first two images below, while the NYU 'd' and 'gt' look like the last two.

[image: KITTI 'd']
[image: KITTI 'gt']
[image: NYU 'd']
[image: NYU 'gt']
If the KITTI depth values are in metres, what unit are the NYU depth values in? I think neither m nor mm works here; is it cm? Or perhaps I have a problem converting the NYU dataset format. Do you know what unit the NYU depth maps use? The RMSE displayed now is very large, and I believe the problem is related to the unit of the per-pixel depth values in the dataset. I hope this is clear.
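For reference: in the KITTI depth-completion format the 16-bit PNGs encode metres as value / 256 with 0 meaning no measurement, while the official NYU Depth V2 depths are floats already in metres; the scale of any 16-bit PNG you export yourself depends entirely on how it was saved. A minimal sketch of both decodings, with the NYU PNG scale as an assumption:

```python
import numpy as np
from PIL import Image

def read_kitti_depth(path):
    """KITTI depth-completion PNGs: 16-bit, metres = value / 256, 0 = no measurement."""
    d = np.array(Image.open(path), dtype=np.float32)
    d[d > 0] /= 256.0                     # -> metres; zeros stay invalid
    return d

# The official NYU Depth V2 labelled data stores depth as floats already in metres.
# If you export it to 16-bit PNG yourself, divide by the same scale you used when
# saving; 1000 (millimetres) is a common choice, but this is an assumption.
NYU_PNG_SCALE = 1000.0

def read_nyu_depth_png(path, scale=NYU_PNG_SCALE):
    d = np.array(Image.open(path), dtype=np.float32)
    return d / scale                      # -> metres
```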

Sorry for the trouble!

@JUGGHM
Owner

JUGGHM commented May 11, 2022

I am not certain now because I didn't save the experimental records. But one thing I can confirm is that ENet works on NYU Depth V2, though we haven't officially released the code and results. We followed an experimental setting similar to NLSPN's on the indoor dataset. You could check your settings and code in more detail.

@yuyu19970716
Author

Thank you so much for confirming this for me! In that case I don't have to change the network, hahaha! Thank you for your patient answers!
I think I understand what you mean: follow the way NLSPN loads the NYU dataset and write a dataloader to obtain the image data; as far as I know, that network reads the data from H5 files.
I am currently converting the NYU dataset into the KITTI format that PENet can take as input. I have made the following modifications, and I would like you to check whether they are correct, so that I can rule them out as the reason for the large error:

  1. I resized the NYU images to 1242x375 (the original size is 640x480).
  2. At this point the NYU RGB images are 24-bit, and the raw depth and gt are both 16-bit (the originals are 8-bit, and the PENet format requires 16-bit input data, so I made this change).
  3. I modified the camera intrinsic matrix K accordingly.

The above are my modifications to the dataset format that PENet needs (a rough sketch of these steps is below)! I suspect you may not have done it this way, but I want to check with you whether it is reasonable.
I will keep looking forward to your reply! Thank you for putting up with my questions!
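A rough sketch of the three steps above, assuming the NYU depth comes in metres and using the KITTI uint16 = metres * 256 convention (all names are illustrative, not PENet code):

```python
import numpy as np
from PIL import Image

NYU_W, NYU_H = 640, 480
KITTI_W, KITTI_H = 1242, 375   # target size from step 1 above

def convert_frame(rgb, depth_m, K):
    """Rough sketch of steps 1-3.

    rgb:     H x W x 3 uint8 NYU colour image
    depth_m: H x W float depth in metres (as read from the NYU data)
    K:       3 x 3 NYU camera intrinsic matrix
    """
    sx, sy = KITTI_W / NYU_W, KITTI_H / NYU_H

    # 1. Resize the RGB image to the KITTI resolution (stays 24-bit).
    rgb_k = np.array(Image.fromarray(rgb).resize((KITTI_W, KITTI_H), Image.BILINEAR))

    # 2. Resize depth with nearest neighbour (interpolation would invent depths at
    #    object boundaries) and store it as 16-bit using the KITTI convention:
    #    uint16 value = metres * 256, with 0 meaning invalid.
    d = np.array(Image.fromarray(depth_m.astype(np.float32))
                 .resize((KITTI_W, KITTI_H), Image.NEAREST))
    d16 = np.clip(d * 256.0, 0, 65535).astype(np.uint16)

    # 3. Rescale the intrinsics to match the new resolution.
    K_k = K.astype(np.float64)
    K_k[0, :] *= sx   # fx, skew, cx
    K_k[1, :] *= sy   # fy, cy
    return rgb_k, d16, K_k
```

Note that stretching the 4:3 NYU frames to the KITTI aspect ratio distorts the images, which is likely part of why the reply below says the upsampling is unnecessary.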

@yuyu19970716
Author

[image: first-epoch training log]
This is the RMSE from my first epoch of training ENet; it is about ten times larger than on the KITTI dataset, where the RMSE is only a few hundred from the very beginning of training ENet.

@yuyu19970716
Author

[image]
Now I suspect that after converting to 16 bits, the NYU values are no longer real depths. How can I recover the real depth distances?
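One quick way to check is to read a frame back and look at its value range; indoor NYU depths should fall roughly between 0.5 m and 10 m once correctly decoded. A minimal sketch, assuming the frames come from the official labelled .mat (an HDF5 file readable with h5py):

```python
import h5py
import numpy as np

# The official nyu_depth_v2_labeled.mat is a MATLAB v7.3 (HDF5) file whose
# 'depths' array holds float depths in metres (assumption: this is the file
# the frames were extracted from).
with h5py.File("nyu_depth_v2_labeled.mat", "r") as f:
    depth_m = np.array(f["depths"][0]).T      # first frame, transposed to H x W

print("min %.2f m  max %.2f m  mean %.2f m"
      % (depth_m.min(), depth_m.max(), depth_m.mean()))

# If the values fed to the network are hundreds of times larger than this range,
# the 16-bit conversion scale is the culprit: divide by the same factor that was
# used when the PNGs were saved.
```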

@yuyu19970716
Author

But I still want to ask: could you provide the code for using the NYU dataset with PENet? I would be very grateful!

@JUGGHM
Owner

JUGGHM commented May 13, 2022

But I still want to ask: could you provide the code for using the NYU dataset with PENet? I would be very grateful!

I conducted my experiments with the NYU Depth V2 dataloaders that were uploaded earlier in your issues as '.txt' files. You can remove the '.txt' suffix and add them to the dataloaders directory. However, we will probably not formally release code and results on NYU Depth V2 for now.
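For example, a minimal sketch of that step (the download location and filenames are placeholders, not the actual attachment names):

```python
from pathlib import Path
import shutil

# Hypothetical filenames: strip the '.txt' suffix from the attached files and
# move them into PENet's dataloaders/ directory.
for f in Path("downloads").glob("*.py.txt"):                 # e.g. nyu_loader.py.txt
    shutil.move(str(f), str(Path("dataloaders") / f.stem))   # -> dataloaders/nyu_loader.py
```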

@yuyu19970716
Author

OK, thank you very much for your patient reply again!
So in other words, converting the NYU dataset into the KITTI format as the input to PENet does not work, and I need to use another method to read the NYU dataset. Is that what you mean?

@JUGGHM
Owner

JUGGHM commented May 13, 2022

OK, thank you very much for your patient reply again! So in other words, converting the NYU dataset into the KITTI format as the input to PENet does not work, and I need to use another method to read the NYU dataset. Is that what you mean?

I don't think it is necessary to upsample the NYU images to the KITTI size. But I am not sure which point you are referring to.

@yuyu19970716
Author

I think I know what's wrong. I now want to experiment with KITTI first and solve that problem at the same time.
Thank you so much for your patient reply!
