@aws-sdk/client-s3 Passing a stream Body to PutObjectCommand will hang if a retry happens #5479
Comments
Hi @jeanbmar, I'm not sure about your use case here, but in your repro example you are creating the stream from a buffer. In that case you can supply the buffer directly as the Body instead. Thanks.
Hi, that was only for the purpose of the repro; I don't create the Readable from a buffer in real code. But I'm afraid this issue is a major flaw for any command that accepts a Readable body: there's a chance the process will hang, and it's very hard for developers to pinpoint the origin of the problem.
Hi @jeanbmar,
Yeah, that's what I thought. Unfortunately, I was not able to reproduce this artificially. I tried your approach of iterating over that list over and over, but it did not cause the service to throw a 503 Slow Down error.
import { S3Client, PutObjectCommand } from "@aws-sdk/client-s3";
import { Readable } from "stream";
const s3Client = new S3Client();
async function uploadStream(id) {
  const buffer = Buffer.alloc(10000, 'a');
  const stream = Readable.from(buffer);
  const input = {
    Bucket: "foo-bucket",
    Key: `my key${id}`,
    Body: stream,
    ContentLength: buffer.length,
  };
  try {
    const response = await s3Client.send(new PutObjectCommand(input));
    console.log(response.$metadata.httpStatusCode);
  } catch (error) {
    console.log(error);
  }
}
const promises = [];
for (let i = 0; i < 20000; i++) {
  promises.push(uploadStream(i));
}
Promise.all(promises).then(() => {
  console.log("All uploads completed.");
});
I also tried unplugging my Wi-Fi to simulate a connection error, but that didn't work either; the request was not retried. This will need a deeper look. Adding it to our backlog.
@RanVaknin sample repro code is in smithy-lang/smithy-typescript#1092.
@jeanbmar a fix is queued for release sometime next week. We are not going to attempt to buffer the request at this time due to complexity and unpredictable behavior. Any failure while submitting a stream will be thrown instead of retried.
I have the same problem, and the fix does not fix the bug.
@Mathie01, please provide a reproduction code sample.
This issue has not received a response in 1 week. If you still think there is a problem, please leave a comment to avoid the issue being closed automatically.
This thread has been automatically locked since there has not been any recent activity after it was closed. Please open a new issue for related bugs and link to relevant comments in this thread.
Describe the bug
I work on a project that needs to upload a few hundred files to S3 a few times per day. This happens in Lambda. I'm using a single PutObjectCommand for each file and pass bodies as Readable streams. The ContentLength property of PutObjectCommand is properly set.
I noticed Lambda timeouts on PutObjectCommand send calls that I couldn't explain, happening randomly. I monkey-patched @smithy/node-http-handler and figured out that it happened when I was receiving 503 Slow Down errors from AWS. Fine? No, because the 15 s timeout of my Lambda didn't match the retry delay, which is between 100 and 1000 ms; it should have been more than enough. After more monkey patching, I found the cause:
When a Readable is passed to PutObjectCommand, the stream is consumed by the HTTP request through:
https://github.com/awslabs/smithy-typescript/blob/a4b58b32ac2ae778917e276ba381527f551c2d3d/packages/node-http-handler/src/write-request-body.ts#L60C7-L60C7
Now what happens when there's some throttling involved and the request has to be retried? The stream is already consumed and ended. The end event from the Readable will never be emitted again, so the HTTP request will hang instead of producing a response:
https://github.com/awslabs/smithy-typescript/blob/a4b58b32ac2ae778917e276ba381527f551c2d3d/packages/node-http-handler/src/node-http-handler.ts#L148-L156
I'm surprised such a major issue was never noticed. It makes streams impossible to use with PutObjectCommand (and possibly other commands) as soon as some volume is involved, since S3 sends 503s while scaling.
SDK version number
@aws-sdk/client-s3@3.435.0
Which JavaScript Runtime is this issue in?
Node.js
Details of the browser/Node.js/ReactNative version
v18.17.0
Reproduction Steps
Loop on this until S3 replies with a 503 Slow Down error; the send promise won't resolve when retrying.
Observed Behavior
The send promise is hanging.
The send promise should either properly resolve after the retry delay or reject with a message stating that body streams cannot be retried.
Use a PassThrough stream between the body and the HTTP request to copy the flowing data into a buffer, if ContentLength is reasonably small. Use the buffer when a retry happens (or throw if ContentLength was too large).
Additional Information/Context
No response