[SPARK-4183] Enable NettyBlockTransferService by default
Note that we're turning this on for at least the first part of the QA period as a trial. We want to enable this (and deprecate the NioBlockTransferService) as soon as possible in the hopes that NettyBlockTransferService will be more stable and easier to maintain. We will turn it off if we run into major issues.

Author: Aaron Davidson <aaron@databricks.com>

Closes apache#3049 from aarondav/enable-netty and squashes the following commits:

bb981cc [Aaron Davidson] [SPARK-4183] Enable NettyBlockTransferService by default
aarondav authored and pwendell committed Nov 1, 2014
1 parent 7136719 commit 59e626c
Showing 2 changed files with 11 additions and 1 deletion.
2 changes: 1 addition & 1 deletion core/src/main/scala/org/apache/spark/SparkEnv.scala
@@ -274,7 +274,7 @@ object SparkEnv extends Logging {
val shuffleMemoryManager = new ShuffleMemoryManager(conf)

val blockTransferService =
-    conf.get("spark.shuffle.blockTransferService", "nio").toLowerCase match {
+    conf.get("spark.shuffle.blockTransferService", "netty").toLowerCase match {
case "netty" =>
new NettyBlockTransferService(conf)
case "nio" =>
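The change above flips the default value of `spark.shuffle.blockTransferService` from `nio` to `netty`, so clusters that set nothing now get `NettyBlockTransferService`. A minimal sketch of the selection logic, using a plain `Map` as a stand-in for `SparkConf` and returning class names instead of constructing the real services:

```scala
// Simplified model of the SparkEnv selection logic shown in the diff.
// A Map stands in for SparkConf; real Spark constructs actual
// NettyBlockTransferService / NioBlockTransferService instances.
object BlockTransferDemo {
  def chooseService(conf: Map[String, String]): String =
    conf.getOrElse("spark.shuffle.blockTransferService", "netty").toLowerCase match {
      case "netty" => "NettyBlockTransferService"
      case "nio"   => "NioBlockTransferService"
      case other   => sys.error(s"Unsupported block transfer service: $other")
    }

  def main(args: Array[String]): Unit = {
    // With no explicit setting, netty is now chosen by default.
    println(chooseService(Map.empty))
    // The old implementation can still be selected explicitly.
    println(chooseService(Map("spark.shuffle.blockTransferService" -> "nio")))
  }
}
```

The `.toLowerCase` call mirrors the real code: the setting is matched case-insensitively, so `NIO` and `nio` select the same implementation.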
10 changes: 10 additions & 0 deletions docs/configuration.md
@@ -359,6 +359,16 @@ Apart from these, the following properties are also available, and may be useful
map-side aggregation and there are at most this many reduce partitions.
</td>
</tr>
<tr>
<td><code>spark.shuffle.blockTransferService</code></td>
<td>netty</td>
<td>
Implementation to use for transferring shuffle and cached blocks between executors. There
are two implementations available: <code>netty</code> and <code>nio</code>. Netty-based
block transfer is intended to be simpler but equally efficient and is the default option
starting in 1.2.
</td>
</tr>
</table>

#### Spark UI
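Per the documentation added above, users who hit problems with the new Netty path can revert to the previous implementation by setting the property explicitly, for example in `spark-defaults.conf` (a sketch of one standard way to set Spark properties):

```
# spark-defaults.conf (sketch): opt back into the pre-1.2 NIO implementation
spark.shuffle.blockTransferService  nio
```

The same property can equally be set programmatically on `SparkConf` or passed with `--conf` to `spark-submit`.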
