
[SPARK-22221][SQL][FOLLOWUP] Externalize spark.sql.execution.arrow.maxRecordsPerBatch

## What changes were proposed in this pull request?

This is a followup to #19575, which added a user-doc section on setting the maximum number of records per Arrow record batch; this change externalizes the conf referenced in those docs so it is part of the public configuration surface.
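
Once externalized, the conf can be set through the public runtime conf API like any other SQL conf. A minimal sketch (the session setup and the value 5000 are illustrative, not part of this change):

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical session, for illustration only.
val spark = SparkSession.builder().appName("arrow-conf-demo").getOrCreate()

// Cap each ArrowRecordBatch at 5000 records (illustrative value).
spark.conf.set("spark.sql.execution.arrow.maxRecordsPerBatch", 5000)
```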

## How was this patch tested?
NA

Author: Bryan Cutler <cutlerb@gmail.com>

Closes #20423 from BryanCutler/arrow-user-doc-externalize-maxRecordsPerBatch-SPARK-22221.

(cherry picked from commit f235df6)
Signed-off-by: gatorsmile <gatorsmile@gmail.com>
BryanCutler authored and gatorsmile committed Jan 30, 2018
1 parent 75131ee commit 2858eaafaf06d3b8c55a8a5ed7831260244932cd
Showing with 0 additions and 1 deletion.
  1. +0 −1 sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala
@@ -1051,7 +1051,6 @@ object SQLConf {
 
   val ARROW_EXECUTION_MAX_RECORDS_PER_BATCH =
     buildConf("spark.sql.execution.arrow.maxRecordsPerBatch")
-      .internal()
       .doc("When using Apache Arrow, limit the maximum number of records that can be written " +
         "to a single ArrowRecordBatch in memory. If set to zero or negative there is no limit.")
       .intConf
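
In Spark's internal `ConfigBuilder` DSL, `.internal()` marks a conf as internal so it is hidden from the generated configuration docs; deleting that one call is what "externalizes" the setting. A user can then read the effective value back through the public conf API. A small sketch (assumes a `spark` session is in scope; the zero-or-negative semantics come from the doc string above):

```scala
// Read the effective cap; per the doc string, zero or negative means no limit.
val maxRecords = spark.conf.get("spark.sql.execution.arrow.maxRecordsPerBatch").toInt
if (maxRecords <= 0) println("Arrow record batches are unbounded")
else println(s"Arrow record batches capped at $maxRecords records")
```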
