
Can't achieve the best mIoU when batch size = 4 #25

Closed
cocolord opened this issue Apr 24, 2021 · 6 comments

Comments

@cocolord

I can't achieve the best mIoU reported in the original paper, even though I didn't change any hyperparameters except lowering the batch size from 8 to 4. How can I reach the best mIoU? Is it related to batch size?

@lzrobots
Contributor

Yes, you need batch size 8. How much did you get with bs=4?

@cocolord
Author

I tried Naive DeiT and got 77.64 mIoU after 80,000 iterations.

@cocolord
Author

So basically about 1% mIoU behind. But I don't have 8 cards. Is there any way to train to a better result with four cards?

@lzrobots
Contributor

Not bad. You can increase the crop size.
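Besides a larger crop size, one common way to approximate the original batch size 8 on fewer cards is gradient accumulation: average gradients over two micro-batches of 4 before each optimizer step. This is a generic hedged sketch, not code from this repo (the `model`, `loader`, and loss below are placeholders), and note it only matches the optimizer-side effect of a larger batch; it does not change BatchNorm statistics, so results may still differ if the decoder uses BN.

```python
# Hypothetical sketch: emulate an effective batch size of 8 when only
# micro-batches of 4 fit in memory, via gradient accumulation.
# All names here are placeholders, not from the repository.
import torch
import torch.nn as nn

model = nn.Linear(16, 2)  # stand-in for the segmentation model
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
criterion = nn.CrossEntropyLoss()

accum_steps = 2  # 4 samples/micro-batch * 2 = effective batch of 8

optimizer.zero_grad()
for step in range(4):  # stand-in for iterating over a data loader
    x = torch.randn(4, 16)            # micro-batch of 4 samples
    y = torch.randint(0, 2, (4,))
    # Divide by accum_steps so the summed gradients average over 8 samples.
    loss = criterion(model(x), y) / accum_steps
    loss.backward()                   # gradients accumulate across calls
    if (step + 1) % accum_steps == 0:
        optimizer.step()              # update once per 2 micro-batches
        optimizer.zero_grad()
```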

@zbhit123

I have four 2080 Ti cards with 12 GB of memory each. Can I run this code? Thanks! @cocolord

@cocolord
Author

> I have four 2080 Ti cards with 12 GB of memory each. Can I run this code? Thanks! @cocolord

You can run inference with the small ViT or DeiT model.
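For reference, the mIoU numbers quoted in this thread are the per-class intersection-over-union averaged over classes. A minimal sketch of how that metric is typically computed (this is generic NumPy code, not this repository's evaluation script):

```python
# Hedged sketch: mean IoU over label maps, averaging per-class IoU and
# skipping classes absent from both prediction and ground truth.
import numpy as np

def mean_iou(pred, gt, num_classes):
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, gt == c).sum()
        union = np.logical_or(pred == c, gt == c).sum()
        if union > 0:          # class present in pred or gt
            ious.append(inter / union)
    return float(np.mean(ious))

# Toy 2x3 label maps with 3 classes.
pred = np.array([[0, 0, 1], [1, 2, 2]])
gt   = np.array([[0, 1, 1], [1, 2, 2]])
print(round(mean_iou(pred, gt, 3), 4))  # -> 0.7222
```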
