
Decompress limits #181

Closed
Corion wants to merge 13 commits into libwww-perl:master from Corion:decompress-limits

Conversation

@Corion Corion commented Oct 8, 2022

This supersedes #42

Contributor

@simbabque simbabque left a comment


This is very impressive work. Thank you for that. Could you please remove those leftover commented-out debug messages?

I think we should also check why the Brotli test skips in CI.

Comment thread lib/HTTP/Message.pm Outdated
Comment thread lib/HTTP/Message.pm Outdated
Comment thread t/message-decode-brotlibomb.t Outdated
Corion pushed a commit to Corion/HTTP-Message that referenced this pull request Oct 8, 2022
@simbabque simbabque requested a review from oalders October 8, 2022 13:50
Corion pushed a commit to Corion/HTTP-Message that referenced this pull request Oct 8, 2022
@Corion Corion force-pushed the decompress-limits branch from debddbd to caf0b64 on October 8, 2022 13:53
Corion pushed a commit to Corion/HTTP-Message that referenced this pull request Oct 8, 2022
@Corion Corion force-pushed the decompress-limits branch from caf0b64 to 01a181e on October 8, 2022 14:06
@Corion Corion requested review from simbabque and removed request for oalders October 8, 2022 14:06
Author

Corion commented Oct 8, 2022

The Brotli test skips in CI because cpanm is run from the action without --with-recommends. I don't know enough about the Dockerfile setup and the GitHub Actions used to know how to coax this into actually installing things.

Member

oalders commented Oct 9, 2022

> The Brotli test skips in CI because cpanm is run from the action without --with-recommends. I don't know enough about the Dockerfile setup and the GitHub Actions used to know how to coax this into actually installing things.

That also skips in master, so we don't need to address it here.

Comment thread t/message-decode-brotlibomb.t Outdated
Corion pushed a commit to Corion/HTTP-Message that referenced this pull request Oct 9, 2022
Member

oalders commented Oct 10, 2022

I will probably fix the merge conflict and look at merging this tomorrow, so if anyone has objections, now would be a good time to bring them up.

Max Maischein added 13 commits October 11, 2022 19:06
This creates one (more) copy of the content if we limit the output
because Zlib and Bzip2 want to remove the consumed input from the
input string.

Also, this moves away from IO::Uncompress::Gunzip and IO::Uncompress::Bzip2
in favour of Compress::Raw::Zlib and Compress::Raw::Bzip2 because I
found no way to convince IO::Uncompress::Gunzip::gunzip to pass through
the appropriate limiting options.

The API is extended (but not yet documented) in three ways:

1) A global variable, $HTTP::Message::MAX_BODY_SIZE to limit the
maximum size of ->decoded_content

2) An accessor, ->max_body_size, which can be set for individual
HTTP::Responses

3) An optional parameter to ->decoded_content, which is certainly the
most preferable option but requires cooperation from all locations where
->decoded_content is called.
* Eliminate use of wantarray() in ->max_body_size
* Eliminate use of vars.pm
Only run zipbomb tests if we have a recent version of Compress::Raw::Zlib

The Bufsize parameter was introduced in 2.060, so using 2.061 should
be fairly safe here
... mostly to pacify the gods of CI

# Conflicts:
#	Changes
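The bounded decompression these commits implement can be sketched, for illustration, with Python's stdlib zlib (a stand-in for Compress::Raw::Zlib; `decode_limited` and `max_body_size` are illustrative names, not HTTP::Message API):

```python
import zlib

def decode_limited(compressed: bytes, max_body_size: int) -> bytes:
    """Decompress at most max_body_size bytes; raise if the body would exceed it."""
    d = zlib.decompressobj()
    # Ask zlib to produce at most max_body_size bytes of output.
    out = d.decompress(compressed, max_body_size)
    if not d.eof:
        # The stream did not finish within the limit: treat it as a bomb.
        raise ValueError("decoded content exceeds max_body_size")
    return out

# A small "bomb": 10 MB of zeros compresses down to a few KB.
bomb = zlib.compress(b"\0" * 10_000_000)
```

The key point, mirrored in the PR, is that the limit is enforced inside the decompression loop itself, so a hostile payload never gets fully inflated into memory before being rejected.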
@Corion Corion force-pushed the decompress-limits branch from 9039091 to 4fda7fc on October 11, 2022 17:09
Member

oalders commented Oct 12, 2022

Thanks so much for revisiting this, @Corion! I've rebased and merged at the command line. New release will be on CPAN shortly.

@oalders oalders closed this Oct 12, 2022
oalders pushed a commit that referenced this pull request Oct 12, 2022
Limit decoded body size by manually decoding the compressed content

This creates one (more) copy of the content if we limit the output
because Zlib and Bzip2 want to remove the consumed input from the
input string.

Also, this moves away from IO::Uncompress::Gunzip and IO::Uncompress::Bzip2
in favour of Compress::Raw::Zlib and Compress::Raw::Bzip2 because I
found no way to convince IO::Uncompress::Gunzip::gunzip to pass through
the appropriate limiting options.

The API is extended (but not yet documented) in three ways:

1) A global variable, $HTTP::Message::MAX_BODY_SIZE to limit the
maximum size of ->decoded_content

2) An accessor, ->max_body_size, which can be set for individual
HTTP::Responses

3) An optional parameter to ->decoded_content, which is certainly the
most preferable option but requires cooperation from all locations where
->decoded_content is called.

Output the Compress::Raw::Zlib version in case a test fails

We might be fine with version 2.061...

Up our prerequisite to 2.061 for the time being...

Update META.json as well...

Amend changes

* Eliminate use of wantarray() in ->max_body_size
* Eliminate use of vars.pm

Also handle Brotli (de)compression

Reindent to match source

Only run zipbomb tests if we have a recent version of Compress::Raw::Zlib

The Bufsize parameter was introduced in 2.060, so using 2.061 should
be fairly safe here

Remove debugging comments, remove misleading comments

In #181, #181 (review)

Add Changes blurb

... mostly to pacify the gods of CI
oalders pushed a commit that referenced this pull request Oct 18, 2022
Limit decoded body size by manually decoding the compressed content