-
What is the recommended way to throttle S3 reads? I'm currently attempting to fetch objects in parallel by sending all my requests at once and then awaiting them together. In my use-case this could be up to around 2.5k requests. When sending a large number of concurrent requests I see many errors. I've tried playing around with some of the client configuration, for example the below, but it doesn't help a lot:

```rust
let shared_conf = aws_config::from_env()
    .region(Region::new("us-east-1"))
    .credentials_provider(profile_creds)
    .configure(provider_config)
    .retry_config(
        RetryConfig::new()
            .with_max_attempts(10)
            .with_retry_mode(RetryMode::Adaptive),
    )
    .timeout_config(
        TimeoutConfig::new()
            .with_api_call_timeout(Some(Duration::from_secs(30)))
            .with_api_call_attempt_timeout(Some(Duration::from_secs(30)))
            .with_connect_timeout(Some(Duration::from_secs(30)))
            .with_read_timeout(Some(Duration::from_secs(30)))
            .with_tls_negotiation_timeout(Some(Duration::from_secs(30))),
    )
    .load()
    .await;
```

I think what I probably need is a way of limiting the number of concurrent requests, or of controlling the client pool size? I couldn't work out how to do that, but it seems like it should be possible. What am I doing wrong, and what is the correct approach to get maximum throughput with no loss of data?
-
We have a relevant example for sending concurrent `PutObject` requests. We also have an integration test that may be helpful. My key tips for this:
If you're still unsure of how to apply this to your use case, post more information about your use case and I'll advise, assuming I have time. I'm excited to see how this works out for you.
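To sketch the "limit concurrent requests" idea the question asks about: with the async SDK you'd typically cap in-flight futures with something like `futures::StreamExt::buffer_unordered` or a `tokio::sync::Semaphore`. Below is a minimal, pure-std (blocking) illustration of the same bounded-concurrency pattern, where a hypothetical `fetch` stands in for the S3 `GetObject` call; it is a sketch of the technique, not SDK code.

```rust
use std::sync::mpsc;
use std::sync::{Arc, Mutex};
use std::thread;

// Hypothetical stand-in for an S3 GetObject call.
fn fetch(key: usize) -> usize {
    key * 2
}

// Fetch every key, but never run more than `limit` fetches at once.
fn fetch_all_bounded(keys: Vec<usize>, limit: usize) -> Vec<usize> {
    // Workers pull keys from a shared iterator, so at most `limit`
    // requests are in flight at any time.
    let keys = Arc::new(Mutex::new(keys.into_iter()));
    let (tx, rx) = mpsc::channel();
    let mut handles = Vec::new();

    for _ in 0..limit {
        let keys = Arc::clone(&keys);
        let tx = tx.clone();
        handles.push(thread::spawn(move || loop {
            // Take the next key, releasing the lock before fetching.
            let key = { keys.lock().unwrap().next() };
            match key {
                Some(k) => tx.send(fetch(k)).unwrap(),
                None => break,
            }
        }));
    }

    // Drop the original sender so `rx` ends once all workers finish.
    drop(tx);
    let mut results: Vec<usize> = rx.iter().collect();
    for h in handles {
        h.join().unwrap();
    }
    results.sort();
    results
}

fn main() {
    let results = fetch_all_bounded((0..10).collect(), 4);
    println!("{:?}", results);
}
```

The same shape carries over to async: replace the worker threads with a stream of `GetObject` futures and cap them with `buffer_unordered(limit)`, which combined with the SDK's adaptive retry mode keeps you under the throttling limits.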
-
Hello! Reopening this discussion to make it searchable.