[apache#1567] fix(spark): Let Spark use its own NettyUtils
rickyma committed Mar 8, 2024
1 parent d994b27 commit 2fa1a11
Showing 2 changed files with 7 additions and 0 deletions.
2 changes: 2 additions & 0 deletions client-spark/spark2-shaded/pom.xml
@@ -213,6 +213,8 @@
to="lib${rss.shade.native.packageName}_netty_transport_native_kqueue_aarch_64.jnilib"
type="glob"></mapper>
</move>
<!-- Delete NettyUtils, let Spark use its own NettyUtils -->
<delete dir="${project.build.directory}/unpacked/org/apache/spark/network"/>
<echo message="repackaging netty jar"></echo>
<jar destfile="${project.build.directory}/${project.artifactId}-${project.version}.jar"
basedir="${project.build.directory}/unpacked"/>
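
The added delete task removes the unpacked org/apache/spark/network directory before the jar is repackaged, so the shaded artifact no longer bundles the copied NettyUtils. As an illustration only (not part of this commit), a small check like the following could confirm the effect; the class name ShadedJarCheck and the jar path argument are hypothetical:

import java.io.IOException;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

// Hypothetical verification snippet: scans a shaded jar and reports whether
// any org/apache/spark/network classes were left in it. If none are present,
// Spark will load its own NettyUtils at runtime.
public class ShadedJarCheck {
  public static void main(String[] args) throws IOException {
    // args[0]: path to the repackaged spark2-shaded jar (example path, not from the commit)
    try (JarFile jar = new JarFile(args[0])) {
      boolean leaked = jar.stream()
          .map(JarEntry::getName)
          .anyMatch(name -> name.startsWith("org/apache/spark/network/"));
      System.out.println(leaked
          ? "FAIL: shaded jar still contains org/apache/spark/network classes"
          : "OK: Spark will use its own NettyUtils");
    }
  }
}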
5 changes: 5 additions & 0 deletions NettyUtils.java
@@ -39,6 +39,11 @@
/**
* Copied from Spark in order to override the createPooledByteBufAllocator method, because the
* property DEFAULT_TINY_CACHE_SIZE does not exist in Netty versions newer than 4.1.47.
*
* <p>Attention: This class is intended for use in the testing phase only and will not be
* included in the final packaged artifact for production environments. In production, Spark
* will use its own NettyUtils instead of this one. This is somewhat of a hack, but given that
* Spark 2.x is updated infrequently, it is not much of an issue to proceed this way.
*/
public class NettyUtils {

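
For reference, the method this copied class exists to patch is createPooledByteBufAllocator. Below is a minimal sketch (not the verbatim code from this commit) of how the override can avoid the removed tiny cache size, assuming the Netty 4.1 constructor that takes small and normal cache sizes but no tiny cache size:

import io.netty.buffer.PooledByteBufAllocator;
import io.netty.util.internal.PlatformDependent;

// Sketch of the overriding factory method inside NettyUtils: identical in
// spirit to Spark's original, except that the tinyCacheSize argument (backed
// by the removed DEFAULT_TINY_CACHE_SIZE) is dropped.
public static PooledByteBufAllocator createPooledByteBufAllocator(
    boolean allowDirectBufs, boolean allowCache, int numCores) {
  if (numCores == 0) {
    numCores = Runtime.getRuntime().availableProcessors();
  }
  return new PooledByteBufAllocator(
      allowDirectBufs && PlatformDependent.directBufferPreferred(),
      Math.min(PooledByteBufAllocator.defaultNumHeapArena(), numCores),
      Math.min(PooledByteBufAllocator.defaultNumDirectArena(), numCores),
      PooledByteBufAllocator.defaultPageSize(),
      PooledByteBufAllocator.defaultMaxOrder(),
      allowCache ? PooledByteBufAllocator.defaultSmallCacheSize() : 0,
      allowCache ? PooledByteBufAllocator.defaultNormalCacheSize() : 0,
      allowCache && PooledByteBufAllocator.defaultUseCacheForAllThreads());
}

Aside from the dropped tiny cache argument, the sketch keeps Spark's arena and cache defaults unchanged.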
