Throttled GSI settings ignored #198

Closed
ulsa opened this Issue Jul 28, 2014 · 6 comments

@ulsa
Contributor
ulsa commented Jul 28, 2014

The throttled-writes limit for the GSI is set to 40:

[gsi: ^.+_idx$ table: ^dev-.+DynamoDBTable-.+$]
...
throttled-writes-upper-threshold: 40
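
For reference, a fuller GSI section would pair this threshold with the rest of the write-scaling options; a minimal sketch with illustrative values (option names as documented for dynamic-dynamodb around 1.18.x, so verify against your version):

[gsi: ^.+_idx$ table: ^dev-.+DynamoDBTable-.+$]
# Scale up writes once the throttled-write count exceeds this
throttled-writes-upper-threshold: 40
# Size of a scale-up step
increase-writes-with: 50
increase-writes-unit: percent
# Bounds on provisioned write capacity
min-provisioned-writes: 5
max-provisioned-writes: 500
# Consecutive checks required before scaling down
num-write-checks-before-scale-down: 10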

According to the CloudWatch graphs, actual throttled writes are 400-600, yet no scaling occurs.

Versions:

$ pip show boto

---
Name: boto
Version: 2.31.1
Location: /usr/local/lib/python2.7/dist-packages
Requires: 

$ dynamic-dynamodb --version
Dynamic DynamoDB version: 1.18.2

Logs:

2014-07-28 09:32:25,590 - dynamic-dynamodb - DEBUG - Table dev-AppStatusDynamoDBTable-NT88H7LV4LHS GSI exists_idx matches GSI config key ^.+_idx$
2014-07-28 09:32:25,590 - dynamic-dynamodb - INFO - dev-AppStatusDynamoDBTable-NT88H7LV4LHS - Will ensure provisioning for global secondary index exists_idx
...
2014-07-28 09:32:25,812 - dynamic-dynamodb - DEBUG - dev-AppStatusDynamoDBTable-NT88H7LV4LHS - GSI: exists_idx - Currently provisioned write units: 12
2014-07-28 09:32:25,812 - dynamic-dynamodb - INFO - dev-AppStatusDynamoDBTable-NT88H7LV4LHS - GSI: exists_idx - Consumed write units: 0%
2014-07-28 09:32:25,835 - dynamic-dynamodb - INFO - dev-AppStatusDynamoDBTable-NT88H7LV4LHS - GSI: exists_idx - Write throttle count: 600
2014-07-28 09:32:25,835 - dynamic-dynamodb - DEBUG - dev-AppStatusDynamoDBTable-NT88H7LV4LHS - GSI: exists_idx - Setting min provisioned writes to 5
2014-07-28 09:32:25,835 - dynamic-dynamodb - INFO - dev-AppStatusDynamoDBTable-NT88H7LV4LHS - GSI: exists_idx - Reached provisioned writes min limit: 5
2014-07-28 09:32:25,835 - dynamic-dynamodb - INFO - dev-AppStatusDynamoDBTable-NT88H7LV4LHS - GSI: exists_idx - Consecutive write checks 1/10
2014-07-28 09:32:25,835 - dynamic-dynamodb - INFO - dev-AppStatusDynamoDBTable-NT88H7LV4LHS - GSI: exists_idx - No need to change provisioning
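
Given the configured threshold of 40, the decision implied by that throttle count seems clear; a Python sketch of the expected comparison (hypothetical names, not the project's actual code):

# Hypothetical sketch of the check that should fire here
throttled_writes = 600   # "Write throttle count" from CloudWatch, per the log
upper_threshold = 40     # throttled-writes-upper-threshold from the config
print(throttled_writes > upper_threshold)  # True, yet no scale-up happens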
@ulsa
Contributor
ulsa commented Jul 28, 2014

Went back to 1.17.0 and the problem went away.

@sebdah
Owner
sebdah commented Jul 30, 2014

I had a quick look yesterday, comparing the relevant code between the two versions, but I could not see anything that should affect this. I need to dig into it a bit more.

Just one question though: your logs say:

2014-07-28 09:32:25,835 - dynamic-dynamodb - INFO - dev-AppStatusDynamoDBTable-NT88H7LV4LHS - GSI: exists_idx - Write throttle count: 600

So that value is what you expect, but dynamic-dynamodb does not scale up anyway? In other words, the problem is not that the value from CloudWatch is wrong, but rather the action (not) taken on that value, right?

@sebdah sebdah added the type: bug label Jul 30, 2014
@sebdah sebdah added this to the 1.18.x milestone Jul 30, 2014
@sebdah sebdah self-assigned this Jul 30, 2014
@ulsa
Contributor
ulsa commented Jul 30, 2014

The value 600 is what I see in the CloudWatch graphs as well, so I can only assume it is correct. What I'm missing is the scale-up action on writes, since the throttled-writes limit for the index is 40.

@sebdah
Owner
sebdah commented Jul 30, 2014

Thanks.

I might have found something interesting. It looks like when the throttle count called for a scale-up but the regular consumption metrics indicated a scale-down, the latter would take precedence.
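
Roughly, as a hypothetical Python sketch of that decision flow (invented names and values, not the actual dynamic-dynamodb code):

# Hypothetical sketch of the suspected bug: the consumption-based branch
# runs after the throttle-based branch and overwrites its result.
def decide_write_units(current_units, consumed_pct, throttle_count, opts):
    updated_units = current_units
    if throttle_count > opts['throttled_writes_upper_threshold']:
        updated_units = current_units * 2               # throttling: scale up
    if consumed_pct <= opts['writes_lower_threshold']:
        updated_units = opts['min_provisioned_writes']  # ...but this wins
    return updated_units

# With the values from the logs above: 600 throttled writes, 0% consumed
opts = {'throttled_writes_upper_threshold': 40,
        'writes_lower_threshold': 30,
        'min_provisioned_writes': 5}
print(decide_write_units(12, 0, 600, opts))  # 5, i.e. no scale-up

If that's right, the fix would be to let a throttle-triggered scale-up take precedence over, or skip, the consumption-based scale-down.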

I have released an alpha version 1.18.3a1 to PyPI. Can you please test the use case with that version?

@ulsa
Contributor
ulsa commented Jul 30, 2014

A quick test with 1.18.3a1 shows that it does scale up on throttled writes:

2014-07-30 21:27:04,515 - dynamic-dynamodb - INFO - dev-AppStatusDynamoDBTable-NT88H7LV4LHS - GSI: exists_idx - Consumed write units: 0%
2014-07-30 21:27:04,537 - dynamic-dynamodb - INFO - dev-AppStatusDynamoDBTable-NT88H7LV4LHS - GSI: exists_idx - Write throttle count: 270
2014-07-30 21:27:04,537 - dynamic-dynamodb - INFO - dev-AppStatusDynamoDBTable-NT88H7LV4LHS - GSI: exists_idx - Resetting the number of consecutive write checks. Reason: scale up event detected
@sebdah
Owner
sebdah commented Aug 4, 2014

Thank you very much, and sorry for the long feedback loop. I'm just back from vacation and have now released 1.18.3, which includes fixes for this.

@sebdah sebdah closed this Aug 4, 2014