
Problem in very large file upload #1289

Closed
ascaler opened this issue Jan 4, 2014 · 9 comments
Labels: support (Questions, discussions, and general support)

ascaler commented Jan 4, 2014

Most of the time, when the upload size is under 1 MB, the upload works correctly and we get all the data in request.req. But if we try to upload a 100 MB file with mode: 'stream' and multipart: 'stream', we only get 65 KB of data and can't find a way to get more. Is that a known issue, or is there a trick to get the remaining data? Any help in this regard is highly appreciated.

@hueniverse (Contributor)

Did you change the server config payload.maxBytes?

https://github.com/spumko/hapi/blob/master/docs/Reference.md#server.config.payload

ascaler (Author) commented Jan 4, 2014

Thanks for the quick reply. Yes, I did set a large maxBytes, and there are two issues with doing that. We wanted to allow 5 GB uploads, but that doesn't work, as it seems to allocate a large amount of memory. If I set it to something like 10 MB, then from the client side I can see that we send the stream with Expect: 100-continue, and as long as the server keeps responding, the client can keep sending data in large chunks. But in this case the buffer still shows only 64 KB of data. I was expecting some kind of stream implementation, or an event that says "receive this data, more is queued", until the client has sent all the data. Our goal is to receive very large files the way S3 does, using multipart upload.

@hueniverse (Contributor)

Sorry, maxBytes is not relevant here, since you are not setting the payload mode to raw or parse; it has no effect for payload mode stream. Can you show me what your route config looks like?

ascaler (Author) commented Jan 4, 2014

var ProcessPUT = {
    payload: {
        maxBytes: 10485760,     // 10 MB
        mode: 'stream',
        multipart: 'stream'
    },
    handler: function (request, next) {
        // All the processing is in there

        // For the uploading request we are expecting 100-continue
        if (request.headers.expect === '100-continue') {
            console.log('In 100 continue');
            console.log(request.headers['content-length']);
            console.log(request.stream);
            console.log(request.raw.req._readableState);
        }
    }
};

The output of this comes out like:

In 100 continue
18171420
undefined
{ highWaterMark: 16384,
buffer: [],
length: 0,
pipes: null,
pipesCount: 0,
flowing: false,
ended: false,
endEmitted: false,
reading: false,
calledRead: false,
sync: true,
needReadable: false,
emittedReadable: false,
readableListening: false,
objectMode: false,
defaultEncoding: 'utf8',
ranOut: false,
awaitDrain: 0,
readingMore: false,
decoder: null,
encoding: null }

Once we check the buffer, its size comes out to be only 64 KB.

hueniverse reopened this Jan 4, 2014
@hueniverse (Contributor)

Can you try this again with master? Note that 2.0 is very different in how the payload is configured, but I completely rewrote payload handling.
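[Editor's note] The 2.0 payload rewrite replaced the mode/multipart options with output and parse. As a hedged sketch of what the route above might look like under the 2.x-era options (the server constructor form, the 'file' field name, and the output path are assumptions for illustration, not verified against this thread):

```javascript
// Sketch of a hapi 2.x-style streaming upload route. output: 'stream'
// hands the handler readable streams instead of buffering the body.
var Hapi = require('hapi');
var fs = require('fs');

var server = new Hapi.Server('localhost', 8000);

server.route({
    method: 'PUT',
    path: '/upload',
    config: {
        payload: {
            output: 'stream',       // do not buffer the whole body in memory
            parse: true,            // parse multipart into per-part streams
            maxBytes: 5 * 1024 * 1024 * 1024   // 5 GB cap on the raw body
        },
        handler: function (request, reply) {
            // With output 'stream', request.payload exposes a readable
            // stream per form field; pipe it out instead of reading it all.
            var part = request.payload.file;   // 'file' field name is assumed
            var out = fs.createWriteStream('/tmp/upload.bin');
            part.pipe(out);
            out.on('finish', function () {
                reply('uploaded');
            });
        }
    }
});
```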

ascaler (Author) commented Jan 7, 2014

Thank you, Eran. I am going to try that and will let you know. Is there any example or documentation update describing the new process? I hope there is, and I will figure it out. Thanks again.

@hueniverse (Contributor)

@ascaler The docs/Reference.md is up to date.

ghost assigned hueniverse Jan 16, 2014
@hueniverse (Contributor)

Assuming it's fixed.

@dheerajsingh25
Can you please provide a working example of uploading a large file (as multipart) with the Expect: 100-continue header?

lock bot locked as resolved and limited conversation to collaborators Jan 13, 2020