Unexpected provisioned throughput increase #135
Thanks for the report. However, I would need the following to see why the provisioning was increased:
It looks like you have a lot of throttled write requests; maybe they are the cause?
(accidentally removed comment): @zgrega wrote: Hello, I have also seen this problem. In my case it happened during a write peak. Brg
Interesting @zgrega. I was not expecting throttling to be considered unless explicitly configured. Will look at what might cause this.
I have located a part of the code that seems to be responsible for this behaviour. The fix in 50d7443 should make dynamic-dynamodb skip throttling unless it is configured explicitly (with a value higher than 0). A beta version of this is included in version 1.10.2b4. Please install it with the following command and verify whether the behaviour still exists.
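The install command itself did not survive in this copy of the thread; since dynamic-dynamodb is distributed on PyPI, installing a specific beta release would presumably look like this (the exact command is an assumption, not the maintainer's original text):

```shell
# Assumed pip invocation for pinning the beta release mentioned above
pip install dynamic-dynamodb==1.10.2b4
```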
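The behaviour the fix describes, reacting to throttled requests only when a throttling threshold has been explicitly configured, can be sketched roughly as follows. This is an illustrative Python sketch, not dynamic-dynamodb's actual code; the function and parameter names are assumed:

```python
def should_scale_up(consumed_pct, throttled_count,
                    upper_threshold_pct, throttled_threshold=0):
    """Decide whether to increase provisioned write throughput.

    A throttled_threshold of 0 means "not configured": throttled
    requests are then ignored entirely, which is the behaviour the
    fix in 50d7443 is meant to restore.
    """
    # Normal path: consumption crossed the configured upper threshold.
    if consumed_pct >= upper_threshold_pct:
        return True
    # Throttling path: only taken when a positive threshold was set.
    if throttled_threshold > 0 and throttled_count >= throttled_threshold:
        return True
    return False

# Heavy throttling alone (as during @zgrega's write peak) no longer
# triggers an increase when no threshold is configured:
print(should_scale_up(consumed_pct=10, throttled_count=500,
                      upper_threshold_pct=90))  # → False
# ...but it still does when one is explicitly set:
print(should_scale_up(consumed_pct=10, throttled_count=500,
                      upper_threshold_pct=90,
                      throttled_threshold=100))  # → True
```

Before the fix, the throttling branch effectively fired even with the threshold unset, which would explain an increase while consumption stayed far below writes-upper-threshold.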
This has now been released in version 1.10.2. Thanks for the bug report!
After upgrading I cannot start the daemon! Traceback (most recent call last):
Thanks @OXYAMINE, that is fixed in 1.10.3. A problem with parsing default options for GSIs was behind that.
Thank you!
I can see an unexpected provisioned throughput increase performed by the dynamic-dynamodb script, and have been watching this behaviour for a second day now.
There is a table configured like this:
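The reporter's actual configuration block was lost from this copy of the thread and is not reconstructed here. For context only, a table section in dynamic-dynamodb's INI-style configuration typically looks roughly like the following; the option names follow the project's documented format, but all values are illustrative assumptions, not the reporter's settings:

```ini
[table: ^my_table$]
; Increase writes only once consumption passes 80 % of provisioned
; capacity; scale back down below 30 % (values illustrative)
writes-upper-threshold: 80
writes-lower-threshold: 30
increase-writes-with: 50
decrease-writes-with: 50
min-provisioned-writes: 10
max-provisioned-writes: 500
```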
And there is this in logs:
The question is: why does it increase the limit if consumption is much, much lower than writes-upper-threshold and continues to decrease?
This has actually cost us a lot of money and makes us wonder how reliable dynamic-dynamodb is.