
[CRITICAL] [lib-storage] Uploading in Safari is in a non-working state #2365

Open
ffxsam opened this issue May 9, 2021 · 34 comments
Labels
dependencies (this issue is a problem in a dependency), p3 (minor priority), workaround-available (a workaround is available)

Comments

@ffxsam

ffxsam commented May 9, 2021

Describe the bug

When uploading from Safari directly to S3 (credentials supplied via Cognito), I get this error:

Body Data is unsupported format, expected data to be one of: string | Uint8Array | Buffer | Readable | ReadableStream | Blob

I'm passing a File object like so:

const upload = new Upload({
  client: s3,
  params: {
    Bucket: process.env.VUE_APP_UPLOAD_BUCKET,
    Body: file,
    ContentType: file.type,
    Key: destKey,
    Metadata: metadata,
  }
});

Your environment

SDK version number

@aws-sdk/lib-storage@3.14.0

Is the issue in the browser/Node.js/ReactNative?

Browser

Details of the browser/Node.js/ReactNative version

Safari 14.0.2 and under

Steps to reproduce

(see code snippet above)

Observed behavior

Uploading a file using the Upload class fails with error:
Body Data is unsupported format, expected data to be one of: string | Uint8Array | Buffer | Readable | ReadableStream | Blob

Expected behavior

I would expect WebKit/Safari to follow W3C standards better. 😉 Also, uploads worked totally fine in SDK v2.

Additional context

Pour yourself a coffee and pull up a chair...

So, lib-storage is checking for the existence of data.stream as a function, which is totally legit:

https://github.com/aws/aws-sdk-js-v3/blob/main/lib/lib-storage/src/chunker.ts#L19-L21

File, inheriting from Blob, should implement that. Unfortunately, in Safari, Blob.prototype.stream() does not exist, even though this is a W3C standard drafted way back in May 2019. This wasn't supported in Safari until 14.1.
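As a hedged illustration (not the SDK's actual code), the feature detection at issue boils down to a check like this; `hasBlobStream` and `chooseBodyStrategy` are hypothetical helper names:

```javascript
// Hypothetical helper: detect whether this runtime's Blob/File implements
// the W3C Blob.prototype.stream() method (absent in Safari < 14.1).
function hasBlobStream(blobLike) {
  return typeof blobLike.stream === "function";
}

// Example: choose an upload strategy from the detection result.
function chooseBodyStrategy(blobLike) {
  // With stream() present the chunker can consume the body lazily;
  // without it, the caller must fall back to arrayBuffer() or slice().
  return hasBlobStream(blobLike) ? "stream" : "buffer-fallback";
}
```

In modern Chrome, Firefox, and Safari ≥ 14.1 a File takes the "stream" path; on Safari ≤ 14.0 it would need the fallback.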

I'm working around this with the following code, but it's not ideal, as it creates a slight lag for the user as very large files (potentially 70-150MB) are converted to Uint8Array before passing to the Upload constructor:

const fileBuffer = await file.arrayBuffer();
const uploadParams: PutObjectCommandInput = {
  Bucket: process.env.VUE_APP_UPLOAD_BUCKET,
  Body: new Uint8Array(fileBuffer),
  ContentType: file.type,
  Key: destKey,
  Metadata: metadata,
};
const upload = new AWSUpload({
  client: s3,
  params: uploadParams,
});

I don't think there's any ideal workaround at this point. But maybe this library could handle this scenario instead of putting it on the developer (so they don't wind up wasting time like I did). For instance, the getChunk function could check for data being an instance of Blob whose data.stream is undefined, and in that case call data.arrayBuffer() to read the data and return it.

If there's a better way I can handle this, I'm open to hearing about it! But if Safari doesn't support Blob streaming, it seems like reading the entire file into memory is the only alternative at this point.
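A minimal sketch of that suggestion (using a hypothetical `toUploadBody` helper; this is not the SDK's API) might look like:

```javascript
// Hypothetical fallback sketch: keep streamable bodies as-is, but convert
// Blob-likes that lack stream() (Safari < 14.1) to a Uint8Array via
// arrayBuffer(). Note this still buffers the whole file in memory, so it
// is a compatibility fallback rather than a real fix.
async function toUploadBody(data) {
  if (data && typeof data.stream === "function") {
    return data; // the chunker can call data.stream() directly
  }
  if (data && typeof data.arrayBuffer === "function") {
    return new Uint8Array(await data.arrayBuffer());
  }
  return data; // strings, Uint8Array, etc. pass through unchanged
}
```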

@ffxsam ffxsam added bug This issue is a bug. needs-triage This issue or PR still needs to be triaged. labels May 9, 2021
@ffxsam (Author)

ffxsam commented May 9, 2021

Just had a thought:

The v2 AWS SDK didn't have this issue. So I'm not sure what they're doing differently.

@ffxsam (Author)

ffxsam commented May 9, 2021

UPDATE (related issue).

Uploading using this library and Safari 14.1 also breaks (when using the stream() method). The page auto-reloads, and I get a strange message at the top of the Safari window: "The webpage was reloaded because a problem occurred." When I read the full ArrayBuffer into memory and pass a Uint8Array instead, it works fine. So this is something related to Blob#stream again… maybe.

Here's a reproduction of this upload issue: https://github.com/ffxsam/repro-safari-upload

Clone, run yarn, then yarn serve. You'll obviously need to fill in the proper credentials and S3 bucket name.

@ffxsam ffxsam changed the title [lib-storage] Cannot upload files from Safari directly to S3 like other browsers [CRITICAL] [lib-storage] Uploading in Safari is in a non-working state May 9, 2021
@ffxsam (Author)

ffxsam commented May 9, 2021

@AllanZhengYP I think this could be considered critical. Anyone using the v3 SDK with Safari is going to run into this issue, I would imagine.

@ffxsam (Author)

ffxsam commented May 20, 2021

This is becoming more of an issue with our customers. I'd definitely appreciate some help on this! @trivikr @AllanZhengYP @jeskew

@ajredniwja ajredniwja added the investigating Issue is being investigated and/or work is in progress to resolve the issue. label May 24, 2021
@ajredniwja (Contributor)

ajredniwja commented May 24, 2021

@ffxsam apologies for the late reply; I was able to reproduce this previously. I need to investigate more to find an appropriate solution. You are right that the missing Blob.prototype.stream() in Safari is the original cause of the issue.

@ajredniwja ajredniwja added this to In progress in Investigation May 24, 2021
@ffxsam (Author)

ffxsam commented May 24, 2021

@ajredniwja Thanks for the response! It looks like Safari 14.1 implements Blob.prototype.stream, but this SDK's use of it causes Safari's upload process to crash and reload the page with the "The webpage was reloaded because a problem occurred" error.

For now, we've added in the old aws-sdk v2 to use S3.upload.

@ajredniwja (Contributor)

It's unfortunate that you have to downgrade as a workaround. I will discuss it with the team and see what a potential solution could be.

@ffxsam (Author)

ffxsam commented May 24, 2021

Thanks, Ajwinder!

@ajredniwja (Contributor)

Hey @ffxsam, I was able to run some tests. #2183 (comment) used this code, which didn't work in previous versions but works in 14.1.1.

@ffxsam (Author)

ffxsam commented May 25, 2021

@ajredniwja I don't think that solution will work for me:

const blob = await response.blob();

This looks like it loads the entire blob into memory. We have customers uploading multiple files that could range from 20MB to 100MB. I was already reading the whole file into memory as a workaround, and it bombed some users' browsers.

We really do need streaming to work like it did with aws-sdk's upload method.
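One memory-bounded alternative (a sketch, assuming Blob.prototype.slice and Blob.prototype.arrayBuffer are available — slice has shipped in Safari for years, arrayBuffer since Safari 14; older engines would need a FileReader shim) is to read the file part by part:

```javascript
// Sketch, not SDK code: read a Blob/File in fixed-size chunks via slice(),
// so only one part's bytes are resident in memory at a time.
async function* blobChunks(blob, partSize) {
  for (let offset = 0; offset < blob.size; offset += partSize) {
    const part = blob.slice(offset, offset + partSize);
    // Only up to `partSize` bytes are materialized per iteration.
    yield new Uint8Array(await part.arrayBuffer());
  }
}
```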

@ajredniwja (Contributor)

I understand, let me try and work with the example you provided.

@ajredniwja (Contributor)

@ffxsam, I'm getting a CORS error for some reason when trying to work with the code you provided. I was going to look into a polyfill for previous versions.

@ffxsam (Author)

ffxsam commented Jun 3, 2021

@ajredniwja Very odd, not sure why you'd get a CORS error. Since the code isn't doing anything that would require any sort of CORS setup, the only thing I can think of is that your destination S3 bucket doesn't have the proper CORS permissions.

@ajredniwja (Contributor)

I suspected the same, @ffxsam, but I use the same bucket for my React testing. I didn't dig deep, but I'll definitely try to come up with a similar Vue example, since the React code worked on a later Safari version.

@ffxsam (Author)

ffxsam commented Jun 22, 2021

@ajredniwja Hey Ajwinder, just wanted to check in on this and see if there's been any progress. I'm ok with using the older aws-sdk to handle file uploads for now, but would definitely love to remove that dependency from my app to slim things down.

@ajredniwja (Contributor)

Hey @ffxsam, apologies for the late reply; I wasn't working for a while. I tried instantiating a new app and was able to upload to S3 without any problems.

I used the Vue CLI to create the app:
vue create my-app

My package.json (copied from a React project):

{
  "name": "hello-world",
  "version": "0.1.0",
  "private": true,
  "scripts": {
    "serve": "vue-cli-service serve",
    "build": "vue-cli-service build",
    "lint": "vue-cli-service lint"
  },
  "dependencies": {
    "core-js": "^3.6.5",
    "vue": "^3.0.0"
  },
  "devDependencies": {
    "@vue/cli-plugin-babel": "~4.5.0",
    "@vue/cli-plugin-eslint": "~4.5.0",
    "@vue/cli-service": "~4.5.0",
    "@vue/compiler-sfc": "^3.0.0",
    "babel-eslint": "^10.1.0",
    "eslint": "^6.7.2",
    "eslint-plugin-vue": "^7.0.0",
    "@aws-sdk/client-cognito-identity": "^3.15.0",
    "@aws-sdk/client-lex-runtime-service": "^3.15.0",
    "@aws-sdk/s3-request-presigner": "^3.16.0",
    "@aws-sdk/client-s3": "^3.16.0",
    "@aws-sdk/client-sqs": "^3.15.0",
    "@aws-sdk/credential-provider-cognito-identity": "^3.7.0",
    "@aws-sdk/lib-storage": "3.7.0",
    "@testing-library/jest-dom": "^5.11.4",
    "@testing-library/react": "^11.1.0",
    "@testing-library/user-event": "^12.1.10",
    "fs": "0.0.1-security",
    "react": "^17.0.1",
    "react-dom": "^17.0.1",
    "react-scripts": "4.0.3",
    "web-vitals": "^1.0.1"
  },
  "eslintConfig": {
    "root": true,
    "env": {
      "node": true
    },
    "extends": [
      "plugin:vue/vue3-essential",
      "eslint:recommended"
    ],
    "parserOptions": {
      "parser": "babel-eslint"
    },
    "rules": {}
  },
  "browser": {
    "crypto": false,
    "fs": false,
    "path": false,
    "os": false,
    "net": false,
    "stream": false,
    "tls": false
  },
  "browserslist": [
    "> 1%",
    "last 2 versions",
    "not dead"
  ]
}

Code with function to upload:

import { fromCognitoIdentityPool } from "@aws-sdk/credential-provider-cognito-identity";
import { CognitoIdentityClient } from "@aws-sdk/client-cognito-identity";
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

const idPool = "us-west-2khjsdakj";
const Bucket = "aj-teest";
const region = "us-west-2";

export const s3Test = async () => {
  const creds = fromCognitoIdentityPool({
    client: new CognitoIdentityClient({ region }),
    identityPoolId: idPool,
  });

  const client = new S3Client({
    region,
    credentials: creds,
  });

  const Key = `${Date.now()}-new-key`;
  const upload = new Upload({
    client,
    tags: [{
      Key: "my-tag-key",
      Value: "my-tag-value",
    }],
    params: {
      Key,
      Bucket,
      Body: "hello world!",
    },
  });

  upload.on("httpUploadProgress", (progress) => {
    console.log(progress);
  });

  const uploadResult = await upload.done();
  console.log("done uploading", uploadResult);
};

@ffxsam (Author)

ffxsam commented Jun 30, 2021

@ajredniwja As I mentioned before, this is an issue with larger files. Create a 50MB file and use that as your Body parameter. (take the change event of an <input type="file"> and pass the first file in the FileList as the Body)

I tested just now in Safari, and the page still bombs with a message at the top: "This webpage was reloaded because a problem occurred."

You can use my repro I linked to awhile back to test: https://github.com/ffxsam/repro-safari-upload/tree/main/src

@alexforsyth alexforsyth removed the needs-triage This issue or PR still needs to be triaged. label Jul 8, 2021
@ajredniwja (Contributor)

@ffxsam, I used your repo. Safari had a caching issue where it wouldn't pull the latest packages, but that's a separate problem. I saw a series of errors, but the weird part is that they occurred only with the Vue.js framework. I can look into what polyfills they use, as they mention they apply them automatically via Babel.

@ffxsam (Author)

ffxsam commented Jul 14, 2021

@ajredniwja Wow, good find! Thank you for taking so much time to help with this. I'm going to file an issue with Vue and link to this issue, hopefully we can discover a solution.

@AllanZhengYP AllanZhengYP added dependencies This issue is a problem in a dependency. and removed investigating Issue is being investigated and/or work is in progress to resolve the issue. bug This issue is a bug. labels Sep 2, 2021
@AllanZhengYP AllanZhengYP added the workaround-available This issue has a work around available. label Sep 2, 2021
@ludwighen

@ffxsam I have the same issue with Safari crashing and reloading the web page. We are, however, using a React app.

@ffxsam (Author)

ffxsam commented Sep 22, 2021

@ludwighen Ah! Do you think you could put together a small reproduction repo?

@ludwighen

Also weird: files smaller than 500 KB work fine, but larger files fail.
I'll see if I can put something together later, but I'm really using a basic implementation:

const s3 = new S3Client({
    region: 'eu-central-1',
    credentials: fromCognitoIdentityPool({
        client: new CognitoIdentityClient({ region: 'eu-central-1' }),
        identityPoolId: COGNITO_IDENTITY_POOL_ID_EU_CENTRAL
    })
});

const parallelUploadS3 = new Upload({
    client: s3,
    queueSize: 1,
    leavePartsOnError: false,
    params: uploadParams
});

parallelUploadS3.on('httpUploadProgress', progress => {
    const percent = Math.round((progress.loaded / progress.total) * 100);
    onProgress({ percent });
});

await parallelUploadS3.done();

@ffxsam (Author)

ffxsam commented Sep 24, 2021

@ajredniwja Seems like this issue could be due to a conflict with the AWS SDK and some other package, maybe?

@ludwighen Could you paste your package.json here so we can see your dependencies?

@ffxsam (Author)

ffxsam commented Sep 30, 2021

The level of attention this issue is getting is kind of disappointing. It's been months now, and this SDK still causes Safari to bomb when large files are uploaded.

@ajredniwja (Contributor)

@ffxsam, the team is actively looking into it. I will have an update for you soon.

@ffxsam (Author)

ffxsam commented Oct 28, 2021

@ajredniwja Any updates? This has been an issue since May. How large is the team that maintains this SDK? This feature is in an unusable state, and it's frustrating to see this issue not getting the priority I think it deserves.

@ludwighen Just pinging you again. If you share your package.json file, it would help us nail down this issue.

@ahthomsen

Any updates on this one? We are facing the same issue.

@ffxsam (Author)

ffxsam commented Feb 23, 2022

Pinging @trivikr to take a look.

@kuhe kuhe assigned kuhe and unassigned kuhe Jul 7, 2022
@pedro-pedrosa

I'm running into issues in Safari as well, but my main problem is with stream() itself creating a blob behind the scenes, which fails for most of my users who are trying to upload large video files greater than 2 GB.

For anyone interested, I basically reverted, using patch-package, to the behavior v2 used, which is to call slice() instead of stream(). This should probably solve the other Safari issues too.

--- a/node_modules/@aws-sdk/lib-storage/dist-es/chunker.js
+++ b/node_modules/@aws-sdk/lib-storage/dist-es/chunker.js
@@ -14,7 +14,10 @@ export const getChunk = (data, partSize) => {
     else if (data instanceof String || typeof data === "string" || data instanceof Uint8Array) {
         return getChunkBuffer(Buffer.from(data), partSize);
     }
-    if (typeof data.stream === "function") {
+    if (typeof data.slice === "function") {
+        return getChunkBuffer(data, partSize);
+    }
+    else if (typeof data.stream === "function") {
         return getChunkStream(data.stream(), partSize, getDataReadableStream);
     }
     else if (data instanceof ReadableStream) {
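The effect of the patch above can be sketched as a generator that hands the uploader lazy Blob slices instead of calling stream() (names here are illustrative, not the SDK's internals):

```javascript
// Illustrative sketch of slice()-based chunking: each yielded part is a
// lazy Blob slice, so nothing calls stream() and nothing buffers the
// whole file up front.
function* getChunkBySlice(data, partSize) {
  let partNumber = 1;
  for (let offset = 0; offset < data.size; offset += partSize) {
    yield {
      partNumber: partNumber++,
      body: data.slice(offset, offset + partSize),
    };
  }
}
```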

@marcjordan

marcjordan commented Oct 23, 2023

This issue has a "workaround-available" tag. Is the workaround to downgrade to the v2 SDK? Or is the workaround to use patch-package to modify the code?

@glebbars

I have the same problem in Safari when uploading files over 4 GB on a MacBook M2 Pro, or over 1 GB on an iPhone 14.

I am using multipart upload on the frontend with @aws-sdk/client-s3 ^3.418.0 and @aws-sdk/lib-storage 3.501.0.

The error says "Out of memory", and it occurs only with aws-sdk v3, not v2.

Is there a solution for aws-sdk v3?

@christopher-ha

Following this ...

@nejcskofic

I have contacted AWS support about this issue. There were some promises that they would fix it, but there has been radio silence for two months now, and the support request was closed without resolution. So make of that what you will.

My solution was to take the lib-storage source code and modify the problematic portion to behave the same as in v2. patch-package would probably also work, but since it is a relatively simple library, I decided against it to keep the setup simple.

@ffxsam (Author)

ffxsam commented Apr 11, 2024

FYI, this shouldn't be an issue at this point. I'd recommend people use the Amplify JS client for uploads.
