
Support for granular downscaling #263

Merged
Merged 3 commits into sebdah:master on Sep 30, 2015

Conversation

Sazpaimon
Contributor

Much like #258, this adds granular downscaling as well. It only supports downscaling based on consumed vs. provisioned capacity, as I can't imagine a situation where one would want to scale DOWN while experiencing throttling.

With this kind of setup a user could, for example, scale down only slightly at a moderate utilization level, but immediately scale all the way down to 1 read/write unit at 0% utilization. This would prevent a situation like the one described in #245, where the autoscaler would only downscale by a few units each iteration, risking bumping up against the downscale limit.
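For illustration, here is a minimal sketch of the threshold-to-decrement lookup that this kind of granular downscaling implies. The option name, data shape, and helper function below are hypothetical, chosen only to show the idea; they are not dynamic-dynamodb's actual configuration keys or code:

```python
# Hypothetical sketch of granular downscaling: map consumed-capacity
# utilization thresholds to the percentage to scale down by. The names
# and structure here are illustrative only, not the project's real API.

# "If utilization is at or below <key> percent, decrease provisioning by <value> percent."
DECREASE_CONSUMED_READS_SCALE = {
    0: 100,   # completely idle  -> drop straight down to the table minimum
    25: 50,   # lightly used     -> halve the provisioned capacity
    50: 25,   # moderately used  -> trim by a quarter
}


def decrease_percent(consumed_units: float, provisioned_units: float) -> float:
    """Return how much (in percent) to scale down, given current usage."""
    utilization = 100.0 * consumed_units / provisioned_units
    # Find the lowest threshold bucket that the current utilization falls into
    # and apply that bucket's decrease; no matching bucket means no downscaling.
    for threshold in sorted(DECREASE_CONSUMED_READS_SCALE):
        if utilization <= threshold:
            return DECREASE_CONSUMED_READS_SCALE[threshold]
    return 0.0


# 0 consumed of 200 provisioned units (0% utilization) -> 100% decrease.
print(decrease_percent(0, 200))   # 100.0
# 80 consumed of 200 provisioned units (40% utilization) -> 25% decrease.
print(decrease_percent(80, 200))  # 25.0
```

The point of the lookup is exactly the behavior described above: a small trim at moderate utilization, but an immediate drop to the minimum when the table is idle.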

I'll work on the docs if this looks good to you.

On a side note, I think a lot of this code could be consolidated, as I did more copy/pasting of large chunks of code than I'd care to admit, but I figured that refactoring would be out of scope for this feature. I'd like to revisit it after this and my other PRs are merged, though.

@sebdah sebdah self-assigned this Sep 11, 2015
@sebdah sebdah added this to the 1.20.x milestone Sep 11, 2015
@sebdah sebdah modified the milestones: 1.21.x, 1.20.x Sep 11, 2015
@sebdah
Owner

sebdah commented Sep 11, 2015

Thanks for the PR. I'm going to look through it and will likely include it in the 2.1.0 release.

@sebdah sebdah added this to the 2.1.x milestone Sep 11, 2015
sebdah added a commit that referenced this pull request on Sep 30, 2015
@sebdah sebdah merged commit 07d23db into sebdah:master Sep 30, 2015
@sebdah
Owner

sebdah commented Sep 30, 2015

This has now been released in version 2.1.0! Thanks again for the PR, @Sazpaimon.
