Implement QuerystringParser the same way as MultipartParser #180

Open
andrewrk opened this Issue Oct 13, 2012 · 6 comments

@andrewrk
Collaborator

andrewrk commented Oct 13, 2012

(imported from TODO)

@bcaller

bcaller commented Dec 12, 2013

Hello,
I've had a go at this… it's not ready for a pull request yet as I haven't tested it, but is this along the right lines?
https://github.com/bcaller/node-formidable/blob/master/lib/querystring_parser.js

@felixge

Owner

felixge commented Dec 12, 2013

@bcaller yeah, the patch looks like it's going the right way. I can give more feedback if you open a PR. Anyway, I'm not super eager to land this at this point, as there hasn't been a compelling need for it over the years. Are you processing very large query strings?

@bcaller

bcaller commented Dec 12, 2013

Some query strings might be large, but I'm more worried about reading massive, possibly endless strings into memory. I want to enforce a maximum field length so I can just destroy the connection if someone sends a ridiculously large query string, or if the fields are not of the form I'm expecting. If you're unsure about using it, perhaps there should be a minimum content length at which the parser switches from the current version to my streaming version.
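The idea above can be sketched roughly as follows. This is a hedged illustration, not bcaller's actual patch or formidable's API: the class name, the `maxFieldLength` argument, and the `onField`/`onError` callbacks are all made up for the example. The parser emits each key/value pair as soon as it is complete, and reports an error instead of buffering further once a single unterminated field exceeds the limit, so the caller can destroy the connection.

```javascript
// Illustrative streaming query string parser with a per-field size cap.
// Names (QuerystringParser, maxFieldLength, onField, onError) are
// hypothetical, not formidable's real API.
class QuerystringParser {
  constructor(maxFieldLength) {
    this.maxFieldLength = maxFieldLength; // max bytes per buffered field
    this.buffer = '';                     // current, incomplete key=value pair
    this.onField = null;                  // callback: (key, value) => void
    this.onError = null;                  // callback: (err) => void
  }

  // Feed a chunk of the body; complete pairs are emitted immediately,
  // so the whole query string is never held in memory at once.
  write(chunk) {
    this.buffer += chunk.toString('ascii');
    let idx;
    while ((idx = this.buffer.indexOf('&')) !== -1) {
      this._emitPair(this.buffer.slice(0, idx));
      this.buffer = this.buffer.slice(idx + 1);
    }
    // An attacker could omit '&' forever, so cap the partial field too.
    if (this.buffer.length > this.maxFieldLength) {
      this.onError(new Error('field exceeds maxFieldLength'));
    }
  }

  end() {
    if (this.buffer.length) this._emitPair(this.buffer);
    this.buffer = '';
  }

  _emitPair(pair) {
    // Note: decodeURIComponent throws on malformed escapes; a real
    // implementation would catch that and report it via onError.
    const eq = pair.indexOf('=');
    const key = decodeURIComponent(eq === -1 ? pair : pair.slice(0, eq));
    const value = eq === -1 ? '' : decodeURIComponent(pair.slice(eq + 1));
    this.onField(key, value);
  }
}
```

The important property is that `write()` can be called once per network chunk, and only the current in-flight field is ever buffered.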

@bcaller

bcaller commented Dec 12, 2013

@felixge but yeah, I was planning on allowing huge query strings as an alternative to multipart.

@bcaller

bcaller commented Dec 12, 2013

Actually I think anyone should just be using multipart for super long strings. Perhaps this is unnecessary.

@felixge

Owner

felixge commented Dec 13, 2013

> Actually I think anyone should just be using multipart for super long strings. Perhaps this is unnecessary.

Yes, I agree. Multipart is much better suited / optimized for sending large blocks of data.

A streaming parser won't really protect users if they end up using a high-level API that relies on the results of the streaming parser being buffered. And if protection from long query strings is all we want/need, we can simply define a maximum payload size.

So I'd say we close this unless we come up with a more compelling use case?

Anyway, thanks a bunch for making an effort to improve this library, it's much appreciated.
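The "maximum payload size" alternative can be sketched without touching the parser at all: count bytes as they arrive and destroy the request once the limit is crossed. This is an assumption-laden illustration, not formidable code; `readBodyWithLimit` and the limit value are invented for the example. It works with anything that emits `'data'`/`'end'` events and exposes a `destroy()` method, such as an `http.IncomingMessage`.

```javascript
// Illustrative helper, not part of formidable: buffer a request body up to
// `limit` bytes, then abort. `req` can be any stream-like object that emits
// 'data' and 'end' and has destroy(), e.g. an http.IncomingMessage.
function readBodyWithLimit(req, limit, callback) {
  const chunks = [];
  let received = 0;
  let done = false;

  req.on('data', (chunk) => {
    if (done) return;
    received += chunk.length;
    if (received > limit) {
      done = true;
      req.destroy(); // drop the connection rather than buffer more
      callback(new Error('payload too large'));
      return;
    }
    chunks.push(chunk);
  });

  req.on('end', () => {
    if (done) return;
    done = true;
    callback(null, Buffer.concat(chunks));
  });
}
```

This keeps the existing (buffering) querystring parser unchanged while still bounding memory use per request, which is the protection the thread is after.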
