From 1617eaded434069a38cd26cb1335d3fea2501bb0 Mon Sep 17 00:00:00 2001
From: Jiaan Geng
Date: Sun, 10 Apr 2022 20:36:58 -0700
Subject: [PATCH] [SPARK-38391][SPARK-38768][SQL][FOLLOWUP] Add comments for
 `pushLimit` and `pushTopN` of `PushDownUtils`

### What changes were proposed in this pull request?
`pushLimit` and `pushTopN` of `PushDownUtils` return a tuple of Booleans. It is helpful to explain what each Boolean value represents.

### Why are the changes needed?
Make the DS V2 API friendlier for developers.

### Does this PR introduce _any_ user-facing change?
No. It just updates comments.

### How was this patch tested?
N/A

Closes #36092 from beliefer/SPARK-38391_SPARK-38768_followup.

Authored-by: Jiaan Geng
Signed-off-by: Dongjoon Hyun
(cherry picked from commit c4397cb3dee4f9fa16297c224da15475b2d5a297)
Signed-off-by: Dongjoon Hyun
---
 .../sql/execution/datasources/v2/PushDownUtils.scala | 12 ++++++++++--
 1 file changed, 10 insertions(+), 2 deletions(-)

diff --git a/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/PushDownUtils.scala b/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/PushDownUtils.scala
index f72310b5d7afa..862189ed3afff 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/PushDownUtils.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/PushDownUtils.scala
@@ -116,7 +116,11 @@ object PushDownUtils extends PredicateHelper {
   }
 
   /**
-   * Pushes down LIMIT to the data source Scan
+   * Pushes down LIMIT to the data source Scan.
+   *
+   * @return the tuple of Boolean. The first Boolean value represents whether to push down, and
+   *         the second Boolean value represents whether to push down partially, which means
+   *         Spark will keep the Limit and do it again.
    */
   def pushLimit(scanBuilder: ScanBuilder, limit: Int): (Boolean, Boolean) = {
     scanBuilder match {
@@ -127,7 +131,11 @@
   }
 
   /**
-   * Pushes down top N to the data source Scan
+   * Pushes down top N to the data source Scan.
+   *
+   * @return the tuple of Boolean. The first Boolean value represents whether to push down, and
+   *         the second Boolean value represents whether to push down partially, which means
+   *         Spark will keep the Sort and Limit and do it again.
    */
   def pushTopN(
       scanBuilder: ScanBuilder,
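
As an illustration of the semantics documented by the new comments, below is a minimal sketch of how a caller might consume the returned tuple. It assumes the `(Boolean, Boolean)` return shape described above; `PushDownUtilsExample` and `mustKeepLimit` are hypothetical names introduced only for illustration and are not part of this patch.

```scala
import org.apache.spark.sql.connector.read.ScanBuilder
import org.apache.spark.sql.execution.datasources.v2.PushDownUtils

object PushDownUtilsExample {
  // Hypothetical helper: returns true when Spark must keep its own Limit
  // operator after attempting to push `limit` into the data source scan.
  def mustKeepLimit(scanBuilder: ScanBuilder, limit: Int): Boolean = {
    // Per the new comment: the first value is whether the limit was pushed
    // down, the second is whether it was pushed down only partially.
    val (pushed, partiallyPushed) = PushDownUtils.pushLimit(scanBuilder, limit)
    // Keep the Limit if nothing was pushed, or if the source may still return
    // more than `limit` rows (partial push-down), so Spark applies it again.
    !pushed || partiallyPushed
  }
}
```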