On Big Endian platforms, the FullHttpRequest object always has a zero length content because of inconsistent hash values in CASE_INSENSITIVE_HASHER #5925
Hi!
@Scottmitch PTAL
@rz-itac - thanks for reporting. I'm not surprised there is a bug here, as I was not able to get my hands on a big endian machine, and emulating one was very painful IIRC. Let me take a deeper look. Just to confirm that our unit tests would have caught this failure if run on a big endian machine, can you run the following and let me know if you get test failures on your Solaris machine (assuming you have Java 7, git, and Maven 3.1.1+ installed)?

```shell
git clone git@github.com:netty/netty.git
cd netty/common
mvn test
```
Motivation:

PlatformDependent has a hash code algorithm which utilizes UNSAFE for performance reasons. This hash code algorithm must also be consistent with CharSequence objects that represent a collection of ASCII characters. In order to make the UNSAFE versions and CharSequence versions consistent, the endianness should be taken into account. However, the big endian code was not correct in a few places.

Modifications:

- Correct bugs in the PlatformDependent class related to big endian ASCII hash code computation

Result:

Fixes netty#5925
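As background for the fix, the following is a minimal, self-contained sketch (hypothetical helper names, not Netty's actual PlatformDependent code) of why endianness matters here: a bulk path that reads several bytes as one integer produces different values on big- and little-endian platforms, so any per-byte fallback written against one byte order will disagree with the bulk path on the other.

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;
import java.nio.charset.StandardCharsets;

public class EndianHashDemo {
    // Bulk path: read 4 ASCII bytes as a single int in the given byte order,
    // standing in for an UNSAFE word-at-a-time read.
    static int readInt(byte[] data, ByteOrder order) {
        return ByteBuffer.wrap(data).order(order).getInt();
    }

    // Per-byte fallback written assuming a little-endian layout,
    // standing in for the CharSequence-based hash path.
    static int readIntLittleEndianManually(byte[] data) {
        return (data[0] & 0xFF)
             | (data[1] & 0xFF) << 8
             | (data[2] & 0xFF) << 16
             | (data[3] & 0xFF) << 24;
    }

    public static void main(String[] args) {
        byte[] ascii = "cont".getBytes(StandardCharsets.US_ASCII);
        int le = readInt(ascii, ByteOrder.LITTLE_ENDIAN);
        int be = readInt(ascii, ByteOrder.BIG_ENDIAN);
        int manual = readIntLittleEndianManually(ascii);
        // The per-byte path matches only the little-endian bulk read;
        // fed into a hash, the big-endian bulk path diverges from it.
        System.out.println(le == manual); // true
        System.out.println(be == manual); // false (the bytes of "cont" are not palindromic)
    }
}
```

The fix in the commit above is exactly about making the two paths agree: the per-byte computation must mirror the byte order the bulk UNSAFE reads actually see on the running platform.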
Netty version: netty-all-4.1.5.Final.jar
On Big Endian platforms, the FullHttpRequest object always has zero-length content. I've been able to trace this in the debugger to the CASE_INSENSITIVE_HASHER in io.netty.util.AsciiString not returning consistent hashes for the "content-length" field on Big Endian platforms.
Some background: I'm using Netty to process REST requests from clients.
As part of the channel initializer, I'm adding a new HttpServerCodec() to the channel pipeline.
ch.pipeline().addLast(new HttpServerCodec());
What I've noticed is that on Big Endian platforms (specifically, Solaris sparc 5.10 and HP-UX-IA64 B.11.31), the Content-Length header is not being stored and retrieved consistently. This has the effect that Netty never reads the content from the HTTP request from the client.
Using the debugger, I've been able to track this down to inconsistent hash values in the CASE_INSENSITIVE_HASHER in io.netty.util.AsciiString on Big Endian machines (this issue does not appear on Little Endian machines).
When reading the headers from the HTTP client request --
HttpRequestDecoder(HttpObjectDecoder).readHeaders(ByteBuf) line: 584 --
the header gets added as "Content-Length" (mixed case). The hash value computed for this in DefaultHeaders.add is 2100908492.
Later, when this Content-Length header is retrieved by HttpUtil.getContentLength(HttpMessage, long) line: 185, it is looked up in all lower case as "content-length" and the hash value computed in DefaultHeaders.get is 193356951.
Since the hash value used on the "get" doesn't match the one used on the "add", the content-length value is not found. This results in Netty failing to read the content from the socket and a FullHttpRequest object with 0-length content being passed to my channel handler.
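The invariant being violated can be shown with a small sketch. This is a hypothetical case-insensitive hash (not Netty's actual CASE_INSENSITIVE_HASHER algorithm): by lowercasing each ASCII character before mixing it in, one character at a time, "Content-Length" and "content-length" hash identically on any platform, which is exactly the property the add/get round trip above depends on.

```java
public class CaseInsensitiveHash {
    // Hypothetical per-character, case-insensitive ASCII hash.
    // Processing one char at a time makes it endian-independent,
    // unlike a word-at-a-time UNSAFE path that must handle byte order.
    static int hash(CharSequence s) {
        int h = 0;
        for (int i = 0; i < s.length(); i++) {
            char c = s.charAt(i);
            if (c >= 'A' && c <= 'Z') {
                c += 32; // fold ASCII uppercase to lowercase
            }
            h = h * 31 + c;
        }
        return h;
    }

    public static void main(String[] args) {
        // The header added as "Content-Length" must be findable as "content-length".
        System.out.println(hash("Content-Length") == hash("content-length")); // true
    }
}
```

In the bug report above, the two spellings hashed to 2100908492 and 193356951 respectively, so the header map's "get" could never find the entry stored by "add".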