Cannot retrieve getResumableFilesData() and cannot use retry() #1425

Closed
TimDaub opened this Issue Jun 12, 2015 · 7 comments


@TimDaub
TimDaub commented Jun 12, 2015

Hi there,

yesterday we started using fineuploader (Version: 5.2.2).
We're using Chromium 41.0.2272.76 on Ubuntu 15.04 (64-bit).
Our essential goal was to wrap the S3 (direct multipart upload to S3) inside of a react.js component.
This is our configuration:

{
   "autoUpload":true,
   "debug":false,
   "objectProperties":{
      "acl":"public-read",
      "bucket":"<ourBucket>"
   },
   "request":{
      "endpoint":"https://<ourBucket>.s3.amazonaws.com",
      "accessKey":"<myAmazonAccessKey>"
   },
   "signature":{
      "endpoint":"<our signature endpoint url>"
   },
   "uploadSuccess":{
      "params":{
         "isBrowserPreviewCapable":true
      }
   },
   "cors":{
      "expected":true
   },
   "chunking":{
      "enabled":true
   },
   "resume":{
      "enabled":true
   },
   "deleteFile":{
      "enabled":true,
      "method":"DELETE",
      "endpoint":"<our delete endpoint>"
   },
   "session":{
      "endpoint":null
   },
   "validation":{
      "itemLimit":1,
      "sizeLimit":"25000000000"
   },
   "messages":{
      "unsupportedBrowser":"<h3>Upload is not functional in IE7 as IE7 has no support for CORS!</h3>"
   },
   "multiple":true,
   "retry":{
      "enableAuto":true
   },
   "callbacks":{
     // our callback functions, JSON.stringify swallowed them
   }
}

We're using the basic version of Fine Uploader.
Uploading a single file works fine, but when we initialize new fineUploader.s3.FineUploaderBasic(config) and add files via addFiles(files), we can only submit a FileList that contains one File.
This forces us to listen for the onComplete callback and re-initialize FineUploaderBasic every time in order to call addFiles again.

However, the real problem emerges when we try to implement resumable, session-independent uploads.
We have noticed that when uploading a large file, Fine Uploader splits it into chunks, and after the first chunk finishes, the progress data is saved in the browser's local storage.
Here is an example showing that this part works for us:

Key: qqs3resume5.0-skype-ubuntu-precise_4.3.0.37-1_i386.deb-20112698-5242880-https://ascribe0.s3.amazonaws.com
Value:

{
   "name":"skype-ubuntu-precise_4.3.0.37-1_i386.deb",
   "size":20112698,
   "uuid":"dd0580c5-9d4c-4b71-9260-5664e72db404",
   "key":"<our S3 bucket path>",
   "chunking":{
      "enabled":true,
      "parts":4,
      "remaining":[
         1,
         2,
         3
      ],
      "inProgress":[

      ],
      "uploadId":"bLrrana26qJ_fIGejD6oCAkK392nIq.E4.Sah4bT2q51IDZ9ZGovb3w3IvshxnLPRZzBxqH4tRPVgwsOCPiwXc6ZJaaP.Xasf_P9iMs03VA-",
      "etags":[
         {
            "part":1,
            "etag":"\"aa1bae573170c8396c44d6e72c79c53d\""
         }
      ]
   },
   "loaded":5242880,
   "lastUpdated":1434092821127
}

However, we cannot retrieve this information using getResumableFilesData(); we only get an empty array back.
So we looked into the source code and found something interesting:

_iterateResumeRecords: function(callback) {
    if (resumeEnabled) {
        qq.each(localStorage, function(key, item) {
            if (key.indexOf(qq.format("qq{}resume-", namespace)) === 0) {
                var uploadData = JSON.parse(item);
                callback(key, uploadData);
            }
        });
    }
},

The if statement only matches local storage keys that start with qqs3resume-, whereas our version of Fine Uploader stores keys using the schema qqs3resume5.0-.
Are we doing something wrong, or is this a bug in Fine Uploader?

Another thing we tried was setting the retry option to:

{
    enableAuto: true
}

However, when listening to onAutoRetry, onRetry, and onManualRetry, we never receive any events that would prove that retry.enableAuto = true is working.

Any suggestions on what we're doing wrong?

@rnicholus
Member

It sounds like you are reporting three unrelated issues in the same case. If this is true, please split them up.

Your first issue (I think) - only able to submit one file - is expected as you are setting the fileLimit option to 1.
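For reference, the limit in the configuration posted above is set via validation.itemLimit. A hedged sketch of the relevant fragment (per our reading of the Fine Uploader docs, 0 means no limit; verify against your version's documentation):

```javascript
// Configuration fragment only -- not a runnable program.
validation: {
    itemLimit: 0,           // was 1, which rejected all but the first file
    sizeLimit: 25000000000  // ~25 GB, unchanged from the original config
}
```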

I'll have to investigate the getResumableFilesData() issue a bit.

Regarding the callback issue or auto retry issue you are reporting (not sure exactly what issue you are having here) - retrying and the associated callbacks definitely work without any issues that I am aware of. If you are seeing something different, you'll need to provide specific reproduction steps and a clear description of the issue.

@TimDaub
TimDaub commented Jun 12, 2015

OK, yeah, I totally missed the fileLimit option. Multiple uploads are working fine now.

I think the main problem is that getResumableFilesData() is not retrieving results from the browser's local storage.

Therefore, of course, nothing can be auto-retried, because as far as the library is concerned there is nothing to retry.

If I can help with anything more, please reach out to me here and I'll try to answer all your questions.

@rnicholus
Member

Thanks. I'll look into this further and will update when I have a question or resolution.

@TimDaub
TimDaub commented Jun 15, 2015

Good morning,

@rnicholus Any updates on the issue yet?

@rnicholus
Member

Sorry, haven't had a chance to look into this yet. I should be able to get to this on Thursday.

@rnicholus
Member

I found the issue with getResumableFilesData(). Indeed it is not returning any results, but this issue is localized to the logic associated with looking up resumable file records for the user-facing API method and deleting them after they have expired. The resume feature itself works and is not affected as far as I can tell.

If you are having retry or resume issues, the problem is elsewhere, perhaps specific to your environment or integration code. There is a bug in the getResumableFilesData public accessor method, though, and it will be fixed in the next release, perhaps even a hotfix release. The problem is caused by an error in the internal _iterateResumeRecords method in xhr.upload.handler.js: the prefix it matches against localStorage keys does not account for the version segment embedded in the key names (such as qqs3resume5.0-), so zero records are found. Two things call this internal method: the API method that returns resume records, and the code that removes expired records from localStorage.

@rnicholus rnicholus added this to the 5.2.3 milestone Jun 18, 2015
@rnicholus rnicholus referenced this issue in commit 663ca1b (chore(build): inc build num) Jun 24, 2015
@rnicholus
Member

This fix is currently staged in the hotfix/5.2.3 branch - version 5.2.3-1.

@rnicholus rnicholus modified the milestone: 5.2.3, 5.3.0 Jul 6, 2015
@rnicholus rnicholus closed this in b7dbf28 Aug 3, 2015