Fix limiting of chunk size in the NoCoding encoder #2252

Merged
1 commit merged into akka:master on Oct 11, 2018

Conversation

@ollyw (Contributor) commented Oct 11, 2018

This fixes an issue where the limitByteChunksStage function used by the NoCoding encoder was losing data whenever an incoming element was larger than the maximum chunk size. The tests have also been fixed to cover this scenario.

I don't know whether this issue affects people using akka-http in the wild, but it is quite subtle. I came across it while trying to test chunked StreamRefs in akka-cluster in a WIP PR, akka/akka#25643. That PR copies the broken code from akka-http verbatim, so it is likely also broken.
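To make the failure mode concrete, here is a minimal sketch of the behaviour the fix restores. It is not the actual limitByteChunksStage from akka-http (which is implemented as a GraphStage); the limitChunkSize name and the mapConcat-based splitting are purely illustrative. The point is that an element larger than the limit must be re-emitted as several chunks instead of having the excess bytes dropped:

```scala
import akka.NotUsed
import akka.stream.scaladsl.Flow
import akka.util.ByteString

// Illustrative sketch: split every incoming ByteString into pieces of at most
// maxBytesPerChunk bytes, keeping all data instead of dropping the excess.
def limitChunkSize(maxBytesPerChunk: Int): Flow[ByteString, ByteString, NotUsed] =
  Flow[ByteString].mapConcat { bytes =>
    Iterator
      .iterate(bytes)(_.drop(maxBytesPerChunk)) // advance one chunk at a time
      .takeWhile(_.nonEmpty)                    // stop once everything has been emitted
      .map(_.take(maxBytesPerChunk))            // each emitted piece is at most the limit
      .toList
  }
```

The bug described above amounts to emitting only part of such an oversized element, so any bytes beyond the limit were silently lost.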

Fix limiting of chunk size in the NoCoding encoder
The limitByteChunksStage function was losing data when an element was
larger than the maximum chunk size. The tests have also been fixed to
cover this scenario.
@akka-ci (Collaborator) commented Oct 11, 2018

Thank you for your pull request! After a quick sanity check, one of the team will reply with 'OK TO TEST' to kick off our automated validation on Jenkins. This compiles the project, runs the tests, and checks for things like binary compatibility and source code formatting. When two team members have also manually reviewed your contribution and (perhaps after asking for some amendments) accepted it, it should be ready to be merged.

For more details about our contributing process, check out CONTRIBUTING.md - and feel free to ask!

@johanandren (Member) commented Oct 11, 2018

OK TO TEST

@akka-ci added the validating and tested labels and removed the validating label on Oct 11, 2018

@akka-ci (Collaborator) commented Oct 11, 2018

Test PASSed.

@jrudolph (Member) left a comment

LGTM, not sure what I was thinking when this test/code was added in https://github.com/akka/akka/pull/16517/files.

@@ -113,20 +112,22 @@ abstract class CoderSpec extends WordSpec with CodecSpecSupport with Inspectors
    }

    "shouldn't produce huge ByteStrings for some input" in {
-     val array = new Array[Byte](10) // FIXME

@jrudolph (Member) commented Oct 11, 2018

At least I left a FIXME 🙄

-     val array = new Array[Byte](10) // FIXME
-     util.Arrays.fill(array, 1.toByte)
+     val array = new Array[Byte](100007)
+     val random = ThreadLocalRandom.current()

@jrudolph (Member) commented Oct 11, 2018
Why random data? I guess it doesn't really matter but the original test tried (eh, very hard) to emulate a zip bomb which would use well-compressible data.

@ollyw (Contributor) commented Oct 11, 2018

I chose random data because I wanted to detect frames being resent, etc., so each frame needed to be unique. I didn't fancy manually creating an array with 100007 bytes of data made of unique frames. I could have used some kind of sequence, I suppose.
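For reference, the new test data could be generated roughly as follows. This is only a hedged reconstruction based on the diff above (the actual test lives in CoderSpec), using ThreadLocalRandom to fill the oversized payload with random bytes:

```scala
import java.util.concurrent.ThreadLocalRandom

// Sketch only: a payload larger than the maximum chunk size, filled with
// random bytes so that every region of the input is unique. A dropped,
// truncated or duplicated chunk then changes the reassembled bytes, which
// the test can detect by comparing against the original array.
val array = new Array[Byte](100007)
ThreadLocalRandom.current().nextBytes(array)
```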

@jrudolph (Member) commented Oct 11, 2018
I see, thanks for the explanation and the fix!

@jrudolph merged commit e352fe3 into akka:master on Oct 11, 2018

3 checks passed

Jenkins PR Validation: Test PASSed.
continuous-integration/travis-ci/pr: The Travis CI build passed
typesafe-cla-validator: All users have signed the CLA

@jrudolph added this to the 10.1.6 milestone on Oct 11, 2018
