Update default inference batch size #432

Merged: 5 commits merged into main from batch-size on Jun 3, 2024
Conversation

@adamltyson (Member) commented on May 31, 2024

This PR:

  • Updates the default inference batch size to 64. This is based only on my own local experiments, which suggest it gives the best compromise between speed and RAM/VRAM usage. It seems to speed classification up by approximately 40% (depending on the number of points per plane).
  • Exposes the batch size parameter in the napari GUI so users can change it if needed.
  • Prints the timing of classification (as is already done for detection) to help assess performance.

I've left the training batch size alone because training requires more memory, and because the batch size actually affects the training results (through the gradient estimates), unlike inference, where it only changes speed and memory usage.
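
For illustration, here is a minimal sketch of the batched-inference-with-timing pattern described above. The names (`classify`, `model`, `points`, `DEFAULT_INFERENCE_BATCH_SIZE`) are hypothetical stand-ins assuming a PyTorch-style model, not cellfinder's actual API:

```python
import time

import torch

# Hypothetical constant standing in for the new default; the real value
# lives in the project's inference settings.
DEFAULT_INFERENCE_BATCH_SIZE = 64


@torch.no_grad()  # inference only, so no gradient buffers are kept in VRAM
def classify(model, points, batch_size=DEFAULT_INFERENCE_BATCH_SIZE, device="cuda"):
    """Classify candidate points in batches and print the elapsed time."""
    model.eval()
    start = time.time()
    outputs = []
    for i in range(0, len(points), batch_size):
        # Larger batches raise throughput at the cost of peak RAM/VRAM.
        batch = points[i : i + batch_size].to(device)
        outputs.append(model(batch).cpu())
    print(f"Classification finished in {time.time() - start:.1f} s")
    return torch.cat(outputs)
```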

Closes #353

@adamltyson requested a review from a team on May 31, 2024 17:47
@adamltyson changed the title from "Update batch size" to "Update default inference batch size" on May 31, 2024
@IgorTatarnikov (Member) left a comment

Looks good. I tested this on Ubuntu with an RTX 4080; a batch size of 64 was ~50% faster for inference compared to 16.
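
For anyone repeating this comparison, a minimal timing loop along these lines should do (reusing the hypothetical `classify`, `model`, and `points` from the sketch in the PR description):

```python
import time

# Compare the old and new defaults, as in the review comment above.
for bs in (16, 64):
    start = time.time()
    classify(model, points, batch_size=bs)
    print(f"batch_size={bs}: {time.time() - start:.1f} s")
```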

@adamltyson merged commit 01f53e5 into main on Jun 3, 2024
16 checks passed
@adamltyson deleted the batch-size branch on June 3, 2024 10:50
Development

Successfully merging this pull request may close these issues.

[Feature] Need an intelligent way of picking batch size.