
Handle rate limiting errors #2

Closed
iconara opened this issue Jan 11, 2017 · 1 comment
Comments


iconara commented Jan 11, 2017

I think AWS rate limits on a per-account basis, so it's sometimes very easy to run into rate-limiting errors.

Here's an example (unfortunately it didn't print the class name of the error):

Rate exceeded
	path/to/cuffsert/vendor/bundle/ruby/2.0.0/gems/aws-sdk-core-2.6.42/lib/seahorse/client/plugins/raise_response_errors.rb:15:in `call'
	path/to/cuffsert/vendor/bundle/ruby/2.0.0/gems/aws-sdk-core-2.6.42/lib/aws-sdk-core/plugins/idempotency_token.rb:18:in `call'
	path/to/cuffsert/vendor/bundle/ruby/2.0.0/gems/aws-sdk-core-2.6.42/lib/aws-sdk-core/plugins/param_converter.rb:20:in `call'
	path/to/cuffsert/vendor/bundle/ruby/2.0.0/gems/aws-sdk-core-2.6.42/lib/aws-sdk-core/plugins/response_paging.rb:26:in `call'
	path/to/cuffsert/vendor/bundle/ruby/2.0.0/gems/aws-sdk-core-2.6.42/lib/seahorse/client/plugins/response_target.rb:21:in `call'
	path/to/cuffsert/vendor/bundle/ruby/2.0.0/gems/aws-sdk-core-2.6.42/lib/seahorse/client/request.rb:70:in `send_request'
	path/to/cuffsert/vendor/bundle/ruby/2.0.0/gems/aws-sdk-core-2.6.42/lib/seahorse/client/base.rb:207:in `block (2 levels) in define_operation_methods'
	path/to/cuffsert/lib/cuffsert/rxcfclient.rb:87:in `block in stack_events'
	…
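Until the SDK's own retry settings are tuned, one stopgap would be to wrap the failing call in a manual retry with exponential backoff. The sketch below is an assumption, not the project's code: the `with_backoff` helper is hypothetical, and the rescued class `Aws::CloudFormation::Errors::Throttling` is the class that "Rate exceeded" typically maps to in aws-sdk-core 2.x, though the trace above doesn't confirm it.

```ruby
# Minimal sketch of a manual retry wrapper with exponential backoff.
# The helper name, parameters, and the rescued error class are all
# assumptions for illustration.
def with_backoff(retries: 5, base: 0.3, error_class: StandardError)
  attempt = 0
  begin
    yield
  rescue error_class
    raise if attempt >= retries
    sleep(base * (2 ** attempt))  # 0.3s, 0.6s, 1.2s, ... between attempts
    attempt += 1
    retry
  end
end

# Hypothetical usage around the failing call in rxcfclient.rb:
#   with_backoff(error_class: Aws::CloudFormation::Errors::Throttling) do
#     cf.describe_stack_events(stack_name: name)
#   end
```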
@bittrance (Collaborator) commented:

According to aws/aws-sdk-ruby#705 there is a Seahorse retrying plugin installed, but it might be a bit timid. That issue suggests:

cf = Aws::CloudFormation::Resource.new({
  retry_limit: 10,
  retry_backoff: Proc.new { |attempts| sleep(2 ** attempts) }
})

The default retry_limit is 3 and the backoff is calculated with Kernel.sleep(2 ** c.retries * 0.3), so it is too short to be useful for throttling use cases.
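The difference is easy to quantify. A quick sketch of the worst-case total sleep under each policy, assuming the retry counter runs from 0 to limit − 1 (an assumption; the SDK's exact counter semantics aren't shown above):

```ruby
# Worst-case cumulative sleep before the SDK gives up, per the formulas above.
default_total   = (0...3).sum  { |c| (2 ** c) * 0.3 }  # SDK default: ~2.1 seconds total
suggested_total = (0...10).sum { |a| 2 ** a }          # suggested: 1023 seconds at the extreme

puts default_total
puts suggested_total
```

Roughly two seconds of total backoff is indeed unlikely to outlast account-level throttling, while ten doubling retries waits over a quarter of an hour in the worst case, so something in between may be the practical choice.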

In order to verify this, request logging is needed. According to https://aws.amazon.com/blogs/developer/logging-requests/ logging can be turned on; that post uses the v1 `AWS.config` API, but for aws-sdk v2 (as used here) the equivalent is:

require 'logger'
Aws.config[:logger] = Logger.new($stdout)

Will give it a try to see if it makes any difference.

bittrance added a commit that referenced this issue Apr 15, 2024
Large string changes are now presented as diffs