
Added support for batch size and learning rate tuning using Ray backend #2024

Merged 13 commits into master from ray-tune-batch-size on May 12, 2022

Conversation

@tgaddair (Collaborator) commented on May 12, 2022

This PR also changes the batch size algorithm to do only exponential increases; it has been observed that, in general, powers of 2 tend to perform best for maximizing GPU throughput.
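For context, a minimal sketch of this kind of power-of-2 batch size search is shown below. The `try_batch_size` callable, the starting size, and the `max_batch_size` cap are illustrative assumptions for this sketch, not Ludwig's actual API.

```python
# Minimal sketch of a power-of-2 batch size search.
# NOTE: `try_batch_size`, `start`, and `max_batch_size` are illustrative
# assumptions, not Ludwig's actual implementation.

def tune_batch_size(try_batch_size, start=1, max_batch_size=1 << 16):
    """Keep doubling the batch size until a trial fails (e.g. GPU OOM)
    or the upper bound is exceeded; return the largest size that worked."""
    best = None
    batch_size = start
    while batch_size <= max_batch_size:
        try:
            try_batch_size(batch_size)  # run a few training steps at this size
            best = batch_size
            batch_size *= 2             # exponential increases only: 1, 2, 4, 8, ...
        except RuntimeError:            # e.g. CUDA out of memory
            break
    return best
```

Restricting candidates to powers of 2 keeps every trial aligned with the batch sizes that tend to maximize GPU throughput, and stopping at the last successful size avoids a fine-grained search between powers.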

@ShreyaR (Contributor) left a comment

lgtm!

Review thread on ludwig/backend/ray.py (outdated, resolved)
github-actions bot commented on May 12, 2022

Unit Test Results

Files: 6 ±0 · Suites: 6 ±0 · Duration: 1h 24m 32s ⏱️ (−14m 45s)
Tests: 2,776 +2 · Passed: 2,741 ✔️ +2 · Skipped: 35 💤 +2 · Failed: 0 −2
Runs: 8,328 +6 · Passed: 8,219 ✔️ +2 · Skipped: 109 💤 +6 · Failed: 0 −2

Results for commit 6dbf2ce. Comparison against base commit b4cb667.

♻️ This comment has been updated with latest results.

@tgaddair merged commit 494b2c1 into master on May 12, 2022
@tgaddair deleted the ray-tune-batch-size branch on May 12, 2022 at 19:27