
MinkUnet reproduction w/ much larger epoch #112

Closed
kimbss470 opened this issue Aug 22, 2023 · 2 comments

Comments

@kimbss470

Hi, thank you for sharing your work!

When I trained the MinkUNet@114GMACs model for 100 epochs (all other hyperparameters and training configurations left at their defaults), I got 65.28% mIoU on the validation set and 62.95% on the test set.

I would just like to know why you didn't use a larger number of epochs, even though it yields better results.
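For reference, here is a minimal sketch of how I extended the schedule by writing a copy of the default config with a larger epoch count. The config path and the `num_epochs` field name are assumptions for illustration, not taken from this repository, so please adjust them to whatever your config actually uses:

```python
# Hypothetical sketch: copy the default YAML config and extend the training
# schedule to 100 epochs. Paths and the `num_epochs` key are assumptions.
import yaml

src = "configs/semantic_kitti/default.yaml"        # assumed location of the default config
dst = "configs/semantic_kitti/long_schedule.yaml"  # new config with the longer schedule

with open(src) as f:
    cfg = yaml.safe_load(f)

cfg["num_epochs"] = 100  # assumed key name; everything else stays at its default

with open(dst, "w") as f:
    yaml.safe_dump(cfg, f)
```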

@zhijian-liu

A longer training schedule can sometimes lead to improved performance, but it also requires much more training time. In our original paper, we intended to keep the number of training epochs the same across models for fair comparisons.

@deeplearning666

Hello, when I try to download the pre-trained model, I get an error: "Page Not Found. The page you are looking for doesn't exist or has been moved." Since I cannot download the provided model, could you please share it with me?
Thank you! @kimbss470 @zhijian-liu
