
pre-training code #1

Closed

Huiimin5 opened this issue Mar 27, 2023 · 3 comments

Comments

@Huiimin5

Dear author,

Thank you for your interesting contribution. Do you mind sharing the pre-training code, too? 

Many thanks

@jerome-revaud (Contributor)

Hi @Huiimin5
We are currently trying to get the internal authorization to release the pre-training code.
We'll let you know when this happens (hopefully soon, i.e., within the next few months).

@Huiimin5 (Author) commented Mar 30, 2023

Thank you for your response.

I noticed that the results you report for several SOTA methods in your paper are not consistent with those in their original papers. The depth estimation results on NYUv2 for MAE and MultiMAE can be as high as 85.1 and 86.4, respectively (https://github.com/EPFL-VILAB/MultiMAE), but in Table 3 of your paper these numbers are 79.6 and 83.0, respectively. I am confused about where this difference comes from.

@PhilippeWeinzaepfel commented Apr 17, 2023

see #2
