
Upgrade Netty to 4.1.x #6417

Merged 3 commits into apache:master on Oct 5, 2018
Conversation

@drcrallen (Contributor) commented Oct 4, 2018

Spark 2.3.0 upgraded its Netty dependency to the 4.1 branch per apache/spark#19884. This PR upgrades Druid's various dependencies to all use Netty 4.1.x.

Solves #4390

(assuming the extension upgrades to Spark 2.3 or greater)
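Forcing a single Netty line across many transitive dependencies is usually done with a `dependencyManagement` pin in the root `pom.xml`. A minimal sketch of the idea; the property name is illustrative and not taken from this PR:

```xml
<properties>
  <netty4.version>4.1.29.Final</netty4.version>
</properties>

<dependencyManagement>
  <dependencies>
    <!-- Pin every module that pulls Netty transitively to the same 4.1.x release -->
    <dependency>
      <groupId>io.netty</groupId>
      <artifactId>netty-all</artifactId>
      <version>${netty4.version}</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

Direct dependencies on `io.netty` artifacts can then omit a `<version>` element and inherit the pinned one.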

@drcrallen drcrallen added the HTTP label Oct 4, 2018
@drcrallen (Contributor, Author)
This is being proposed because I'm doing some gRPC extensions internally, and the gRPC Java libraries require Netty 4.1.

@leventov (Member) commented Oct 4, 2018

Is #5059 no longer an issue?

@drcrallen (Contributor, Author)
@leventov I don't have a great way to tell right now (we don't use the HTTP emitter internally), but looking at #4973 (comment) I was able to pinpoint the offending library that was bringing in all kinds of Netty versions. It was upgraded to a version that uses Netty 4.1.29.Final.
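Pinpointing which library drags in which Netty versions, assuming a Maven build as in Druid at the time, is typically done with the dependency tree. A sketch (run from the module root):

```shell
# List every artifact that pulls in any io.netty dependency,
# showing the full path from the project down to each one
mvn dependency:tree -Dincludes='io.netty:*'

# Also show the conflicting versions that Maven resolved away
mvn dependency:tree -Dverbose -Dincludes='io.netty:*'
```

The verbose output marks omitted duplicates with the version that lost conflict resolution, which is usually where a stray 4.0.x pull shows up.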

@drcrallen (Contributor, Author) commented Oct 4, 2018

The dependencies now look like this:

https://gist.github.com/drcrallen/848d56a9aa409e15aab350b63eed7bd5

@drcrallen (Contributor, Author) commented Oct 4, 2018

Notably, the following still have "old" Netty dependencies:

  • com.alibaba.rocketmq:rocketmq-remoting (added exclusion)
  • org.apache.hadoop:hadoop-hdfs (but only in tests)

@drcrallen (Contributor, Author)
I'm going to explicitly exclude the RocketMQ dependency on netty-all in the extension.
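An exclusion like the one described would typically look like this in the extension's `pom.xml`. A sketch only; the version property is hypothetical, and the exact form used in the PR isn't shown in this thread:

```xml
<dependency>
  <groupId>com.alibaba.rocketmq</groupId>
  <artifactId>rocketmq-remoting</artifactId>
  <version>${rocketmq.version}</version>
  <exclusions>
    <!-- rocketmq-remoting still pulls an older Netty via netty-all;
         exclude it so the Netty 4.1.x line from the parent pom wins -->
    <exclusion>
      <groupId>io.netty</groupId>
      <artifactId>netty-all</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```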

@drcrallen (Contributor, Author)
@himanshug do you happen to have a way to reproduce the prior error in the new build environment?

java-util is part of the main Druid repo now.

@gianm (Contributor) left a comment

LGTM based on the fact that #4390 and #4973 don't seem as relevant (Spark upgraded its Netty version, and java-util has been inlined into Druid). We should keep an eye on it during release validation for 0.13.0.

@drcrallen (Contributor, Author)
I put a Release Notes tag since we don't have a "might need more validation during release" kind of tag.

@drcrallen drcrallen merged commit 1c4f787 into apache:master Oct 5, 2018
@drcrallen drcrallen deleted the netty/4p1 branch October 5, 2018 19:30
@leventov (Member) commented Oct 5, 2018

@drcrallen please don't merge such important PRs with just one approval and without waiting for 3 working days (so that everybody has a chance to review), as the current policy prescribes. I would say this PR should have been labelled Design Review.

@drcrallen (Contributor, Author)
Whoops, sorry about that.

@drcrallen (Contributor, Author)
On a side note, I don't see how a version change qualifies as a design review. Can you explain the thought process there more?

@leventov (Member) commented Oct 5, 2018

I mean, this is an important decision that more than one party (apart from the PR author) should agree with. At Metamarkets we decided that we could upgrade to Netty 4.1, but only after this PR was merged. And we still don't know what the people who care about Hadoop support (@nishantmonu51 / @b-slim) think.

@gianm gianm added this to the 0.13.0 milestone Oct 6, 2018
@gianm (Contributor) commented Oct 6, 2018

FWIW, we will definitely be putting this through its paces Hadoop-support-wise before voting on the 0.13.0 release. So if it causes anything bad to happen, we will speak up then. @nishantmonu51 / @b-slim probably will too.

@nishantmonu51 (Member)
+1 from my side. If we face compatibility issues in our internal testing with the RC, we will raise this again.
