Upload session example #80
Comments
|
Thanks for the request! We don't have a sample of using upload sessions in this JavaScript SDK, but I'll be sure to pass this along as a request for one. |
|
Just one more thing. When I am uploading an image using Thanks! |
|
The |
|
Thanks! I got it wrong. I just completely missed the encoding while reading the file. Was using utf-8. |
|
+1 for this example being added - I can't find a single example of how to use upload sessions! I've been struggling with this for a couple of hours. |
|
It would also be nice to include a batch upload example, as the documentation is a bit lacking and confusing. |
|
I would like to see an example of monitoring upload progress. The previous SDK had an onXhr member where I could call clientObj.onXhr.addListener() and then use xhr.upload.addEventListener(). Is there access to the XMLHttpRequest object in this new SDK? |
|
Thanks for the requests all! @kennellink I don't believe this SDK exposes that, but I'll be sure to pass this along as a feature request. |
|
@greg-db Any update on this? An example of using sessions and/or batch upload would be really helpful, as the docs are a bit sparse about this (or maybe it's just me who finds code easier to understand than API docs). |
|
Okay, to be more specific… I read the docs here: https://www.dropbox.com/developers/reference/data-ingress-guide I struggle with the third step, since I don't know what its argument should be. Here's my example code so far, with the missing piece commented. |
|
Hi @florianbepunkt, I don't have an update on a sample for this right now, unfortunately. As far as the documentation is concerned though, if you're using ... Here's a contrived example that should make this clearer: ... ... |
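Since the original snippet isn't preserved above, here is a rough sketch of what such a contrived example could look like. It assumes two small placeholder files (testFile1 / testFile2), a dbx client from the official dropbox npm package, and an older SDK version that returns API results directly (newer versions wrap them in .result and no longer need a fetch implementation passed to the constructor):

const { Dropbox } = require('dropbox')

// Placeholder access token and file contents; replace with real values.
const dbx = new Dropbox({ accessToken: '<ACCESS TOKEN>' })
const testFile1Data = 'test file 1 contents'
const testFile2Data = 'test file 2 contents'

// Start one upload session per file. For small payloads all of the data can be
// sent in the start call; `close: true` marks the session as complete.
dbx.filesUploadSessionStart({ close: true, contents: testFile1Data })
  .then((session1) =>
    dbx.filesUploadSessionStart({ close: true, contents: testFile2Data })
      .then((session2) =>
        // Commit both sessions in a single batch call. Each entry's offset is
        // the total number of bytes uploaded to that session (byte length is
        // fine here since these test strings are ASCII).
        dbx.filesUploadSessionFinishBatch({
          entries: [
            {
              cursor: { session_id: session1.session_id, offset: testFile1Data.length },
              commit: { path: '/testFile1.txt' }
            },
            {
              cursor: { session_id: session2.session_id, offset: testFile2Data.length },
              commit: { path: '/testFile2.txt' }
            }
          ]
        })
      )
  )
  // The batch call may return an async job id; if the result is not complete
  // immediately, poll filesUploadSessionFinishBatchCheck with that id.
  .then((result) => console.log(result))
  .catch((error) => console.error(error))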
|
Hi @greg-db, thank you for the example. In your example the sequential execution of the promises is not guaranteed, right? So basically testFile2 could finish uploading before testFile1. Is the order of the entries in dbx.filesUploadSessionFinishBatch not relevant? Let's say testFile2 is uploaded before testFile1, but the entries array order is testFile1, testFile2. |
|
That's correct, in my sample the For the sake of making the example more readable, I separated the different pieces. In a real app, you'd need to make sure you have all of the You can give the |
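To make that point concrete, here is a small variation of the sketch above (same placeholder names and assumptions) that waits for both session starts before building the entries array:

// Start both sessions in parallel; they may complete in either order.
// Only build the entries array once every session has been started; the order
// of `entries` does not need to match the order in which the uploads finished.
Promise.all([
  dbx.filesUploadSessionStart({ close: true, contents: testFile1Data }),
  dbx.filesUploadSessionStart({ close: true, contents: testFile2Data })
])
  .then(([session1, session2]) =>
    dbx.filesUploadSessionFinishBatch({
      entries: [
        { cursor: { session_id: session1.session_id, offset: testFile1Data.length },
          commit: { path: '/testFile1.txt' } },
        { cursor: { session_id: session2.session_id, offset: testFile2Data.length },
          commit: { path: '/testFile2.txt' } }
      ]
    })
  )
  .catch((error) => console.error(error))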
|
Another request for a working example to be added. It would be very helpful. |
|
@greg-db Thanks for the example, but: A) is there a recommended chunk size? B) uploading a 64 KB chunk takes around 5 seconds for me, which seems very slow. |
|
Hi @alesmenzel : A) There isn't a particular recommended chunk size, as the optimal value will depend on various factors, such as the network connection itself. A bigger size is more efficient in general, but it increases the risk of any particular request failing, and if a request fails, you need to re-upload all of the data for that request. A smaller size reduces how much you would need to re-upload after any failure, but it also increases the overhead due to the larger number of HTTPS requests required. That all said, 8 MB is probably a good place to start. B) Using a 64 KB chunk size is generally not recommended, as you need to open and close an HTTPS connection very often, increasing the overhead. Even so, 5 seconds still sounds longer than expected. I just tried it myself and it only took around 0.6 seconds, so it sounds like the network connection is likely a factor in this case. |
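For reference, here is a rough async/await sketch of a single-file chunked upload using an 8 MB chunk size. The dbx client, paths, and error handling are placeholders; it reads the whole file into memory, and assumes an SDK version that exposes filesUploadSessionAppendV2 and returns results directly rather than under .result:

const fs = require('fs')

const CHUNK_SIZE = 8 * 1024 * 1024 // 8 MB, as suggested above

async function uploadLargeFile (dbx, localPath, dropboxPath) {
  const fileData = fs.readFileSync(localPath)

  // Start the session with the first chunk.
  const session = await dbx.filesUploadSessionStart({
    close: false,
    contents: fileData.slice(0, CHUNK_SIZE)
  })

  // Append full-size chunks; `offset` is the number of bytes uploaded so far
  // and must match what the server has received.
  let offset = Math.min(CHUNK_SIZE, fileData.length)
  while (fileData.length - offset > CHUNK_SIZE) {
    await dbx.filesUploadSessionAppendV2({
      cursor: { session_id: session.session_id, offset },
      close: false,
      contents: fileData.slice(offset, offset + CHUNK_SIZE)
    })
    offset += CHUNK_SIZE
  }

  // Send whatever is left and commit the file in one call.
  return dbx.filesUploadSessionFinish({
    cursor: { session_id: session.session_id, offset },
    commit: { path: dropboxPath, mode: 'overwrite' },
    contents: fileData.slice(offset)
  })
}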
|
@greg-db Summary: Repository: https://github.com/AlesMenzel/dropbox-session-test IMPORTANT prerequisites:
|
|
@alesmenzel Have you tried with a larger chunk size, e.g., 8 MB? (You last mentioned 64 KB.) In any case, this is now getting off topic for this issue, so if you're still seeing very slow performance for these calls, please open a new issue with this information. (That way we don't keep spamming everyone else here.) Thanks! |
|
Yes, the repo is sending 8 MB chunks, which take ~45 s to upload. |
|
Moved the issue to a separate thread at #141. |
|
I need some help running this code in an AWS Lambda function. Can anyone help me please? |
|
@anishtimila It sounds like this isn't an issue with the Dropbox library, so I'm afraid I can't be of help. Perhaps someone else here happens to know. Otherwise, you may be better aided on another forum, e.g., for AWS or node in particular. There's a post on StackOverflow that may help though. |
|
The @greg-db sample code looks right to me. If you want to see some working code, our app includes an upload function that handles single or multiple files, and does multi-part (session-based) uploads on files larger than a specified max chunk size. It does these operations in serial, allowing you to show progress and cancel if desired. If interested, take a look at: https://github.com/SynchroLabs/CloudStashWeb/blob/f828bec5d81e3dd5b6784f359cf4bb75d82eb89a/public/script.js#L186-L376 |
|
Another request for an upload session example - would also love for es6 support! |
|
An example would be great too. I started using the DB API just today and can handle files < 150 MB fine... however, I'm building a website for a recording studio with 800 MB+ files. Will post results if I succeed. |
|
Hi, |
|
Hi, |
|
@madept This issue is for the JavaScript SDK, not PHP. It looks like you also posted on the forum for this though, so we'll follow up there: https://www.dropboxforum.com/t5/API-support/API-V2-Upload-Limit/m-p/272062#M16150 |
|
The issue should be closed |
|
Thanks for the reminder! Yes, there is a sample for this now. |
|
Here's a solution if you need to stream the file:
// `dropbox` is assumed to be an authenticated Dropbox client instance (see below).
async function uploadStream (filePath, dropboxFilePath) {
  console.log(`upload stream file '${filePath}' to '${dropboxFilePath}'`)
  const fileChunker = new FileChunker({ filePath, chunkSize: 10000000 }) // ~10 MB chunks
  // Start the upload session with the first chunk.
  let contents = fileChunker.getNextChunk()
  const fileUploadSession = await dropbox.filesUploadSessionStart({ contents, close: false })
  let response
  while (true) {
    contents = fileChunker.getNextChunk()
    const offset = fileChunker.getLastChunkOffset() // bytes uploaded so far
    if (fileChunker.isFinished) {
      // Last (possibly empty) chunk: send it and commit the file in one call.
      response = await dropbox.filesUploadSessionFinish({
        contents,
        cursor: { session_id: fileUploadSession.session_id, offset },
        commit: { path: dropboxFilePath, mode: 'overwrite' }
      })
      break
    }
    // Append a full chunk. (filesUploadSessionAppendV2 is the newer equivalent,
    // which takes a `cursor` instead of `session_id`/`offset`.)
    await dropbox.filesUploadSessionAppend({
      contents,
      offset,
      session_id: fileUploadSession.session_id
    })
    console.log(`uploaded ${prettyBytes(offset)}`)
  }
  console.log('upload stream finished')
  console.log(response)
  return response
}
const fs = require('fs')
const assert = require('assert')

// Reads a file from disk in fixed-size chunks using synchronous reads.
class FileChunker {
  constructor ({ filePath, chunkSize }) {
    assert(typeof chunkSize === 'number', `chunkSize ${chunkSize} is not a number`)
    assert(typeof filePath === 'string', `filePath ${filePath} is not a string`)
    this.filePath = filePath
    this.chunkSize = chunkSize
    this.fileDescriptor = fs.openSync(filePath, 'r')
    this.fileSize = fs.statSync(filePath).size
    this.nextChunkNumber = 0
    this.isFinished = false
  }
  // Returns the chunk at the given index; a short (or empty) buffer marks the end of the file.
  getChunk (chunkNumber) {
    assert(typeof chunkNumber === 'number', `chunk number ${chunkNumber} is not a number`)
    const chunk = Buffer.alloc(this.chunkSize)
    const offset = 0
    const length = this.chunkSize
    const position = this.chunkSize * chunkNumber
    const bytesRead = fs.readSync(this.fileDescriptor, chunk, offset, length, position)
    if (bytesRead !== this.chunkSize) {
      this.isFinished = true
      return chunk.slice(0, bytesRead)
    }
    return chunk
  }
  getNextChunk () {
    return this.getChunk(this.nextChunkNumber++)
  }
  // Byte offset at which the most recently returned chunk starts, clamped to the file size.
  getLastChunkOffset () {
    if (this.nextChunkNumber === 0) {
      return 0
    }
    const lastChunkOffset = (this.nextChunkNumber - 1) * this.chunkSize
    if (lastChunkOffset > this.fileSize) {
      return this.fileSize
    }
    return lastChunkOffset
  }
}
// Human-readable byte count for progress logging.
const prettyBytes = (bytes) => {
  const sizes = ['Bytes', 'KB', 'MB', 'GB', 'TB']
  if (bytes === 0) return '0 Bytes'
  const i = Math.floor(Math.log(bytes) / Math.log(1024))
  return (bytes / Math.pow(1024, i)).toFixed(2) + ' ' + sizes[i]
} |
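For anyone trying the snippet above, here is a hedged sketch of how the dropbox client it relies on might be created and the function called. The token and paths are placeholders; older SDK versions export the class differently and may require passing a fetch implementation, and newer versions wrap responses in .result:

const { Dropbox } = require('dropbox')

// Placeholder token; the snippet above assumes `dropbox` is an authenticated client.
const dropbox = new Dropbox({ accessToken: '<ACCESS TOKEN>' })

uploadStream('./local/large-video.mov', '/large-video.mov')
  .then((metadata) => console.log('committed:', metadata))
  .catch((error) => console.error(error))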
For file sizes greater than 150 MB we use the upload session. But an example would be great, because I have not been able to implement it correctly despite many attempts.
Thanks!