
Conversation

@pan3793
Member

@pan3793 pan3793 commented Nov 24, 2025

Why are the changes needed?

This PR enables authZ compile support for Spark 4.0. Before this change, the following build fails:

```
build/mvn -Pspark-4.0 -Pscala-2.13 -pl extensions/spark/kyuubi-spark-authz -am install -DskipTests
```

```
[ERROR] [Error] /Users/chengpan/Projects/apache-kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/rule/rowfilter/FilterDataSourceV2Strategy.scala:19: object Strategy is not a member of package org.apache.spark.sql
[ERROR] [Error] /Users/chengpan/Projects/apache-kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/rule/rowfilter/FilterDataSourceV2Strategy.scala:23: not found: type Strategy
[ERROR] [Error] /Users/chengpan/Projects/apache-kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RangerSparkExtension.scala:58: type mismatch;
 found   : org.apache.kyuubi.plugin.spark.authz.rule.rowfilter.FilterDataSourceV2Strategy.type
 required: v1.StrategyBuilder
    (which expands to)  org.apache.spark.sql.SparkSession => org.apache.spark.sql.execution.SparkStrategy
[ERROR] three errors found
```
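The first two errors indicate that the `org.apache.spark.sql.Strategy` alias is not visible when compiling against Spark 4.0, and the third shows the extension injection point ultimately expects a `SparkSession => org.apache.spark.sql.execution.SparkStrategy` builder. Below is a minimal sketch of one way to satisfy that signature by extending `SparkStrategy` directly; the class names and the empty body are illustrative assumptions, not the exact change in this PR.

```scala
import org.apache.spark.sql.{SparkSession, SparkSessionExtensions}
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.execution.{SparkPlan, SparkStrategy}

// Hypothetical strategy extending SparkStrategy, the class the old
// org.apache.spark.sql.Strategy alias expanded to (per the error above).
case class RowFilterStrategySketch(spark: SparkSession) extends SparkStrategy {
  // A planner strategy maps a logical plan to zero or more physical plans;
  // this placeholder simply declines to plan anything.
  override def apply(plan: LogicalPlan): Seq[SparkPlan] = Nil
}

// Registering it via a builder function matches SparkSession => SparkStrategy.
class RowFilterExtensionSketch extends (SparkSessionExtensions => Unit) {
  override def apply(extensions: SparkSessionExtensions): Unit = {
    extensions.injectPlannerStrategy(RowFilterStrategySketch(_))
  }
}
```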

In addition, it refactors two methods in the test helper class `SparkSessionProvider`:

1. Refactor `isCatalogSupportPurge` into an abstract method `supportPurge`, because some UTs do not rely on the current catalog.
2. Add a new helper method `def doAs[T](user: String)(f: => T): T`, so the caller can write (see the sketch right after this list):

   ```
   doAs("someone") {
     ...
   }
   ```
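
For context, here is a minimal sketch of the refactored helper trait, assuming the member names above (`spark`, `sql`, `supportPurge`, `doAs`) with placeholder bodies; it is illustrative only, not the exact Kyuubi test code:

```scala
import org.apache.spark.sql.{DataFrame, SparkSession}

// Sketch only: member names mirror this PR, bodies are placeholders.
trait SparkSessionProviderSketch {
  protected def spark: SparkSession
  protected def sql: String => DataFrame = spark.sql

  // Abstract flag: each suite declares whether its catalog supports PURGE,
  // instead of inspecting the current catalog name at runtime.
  protected def supportPurge: Boolean

  // Curried helper, so the body can be passed as a brace block:
  //   doAs("someone") { ... }
  protected def doAs[T](user: String)(f: => T): T = {
    // real user impersonation is elided in this sketch; just evaluate the body
    f
  }
}
```

A suite mixing this in overrides `supportPurge` for its catalog and wraps privileged statements in `doAs(admin) { ... }`, as the diff discussed below shows.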
    

How was this patch tested?

Pass GHA to ensure it breaks nothing; manually verified the Spark 4.0 compilation with:

```
build/mvn -Pspark-4.0 -Pscala-2.13 -pl extensions/spark/kyuubi-spark-authz -am install -DskipTests
```

Was this patch authored or co-authored using generative AI tooling?

No.

```diff
   protected val sql: String => DataFrame = spark.sql

-  protected def doAs[T](user: String, f: => T): T = {
+  protected def doAs[T](user: String, f: => T, unused: String = ""): T = {
```

Member Author

@pan3793 pan3793 Nov 24, 2025


To keep both, I have to add a dummy parameter; otherwise, the compiler complains:

```
[ERROR] [Error] /Users/chengpan/Projects/apache-kyuubi/extensions/spark/kyuubi-spark-authz/src/test/scala/org/apache/kyuubi/plugin/spark/authz/SparkSessionProvider.scala:94: double definition:
protected def doAs[T](user: String, f: => T): T at line 87 and
protected def doAs[T](user: String)(f: => T): T at line 94
have same type after erasure: (user: String, f: Function0): Object
```
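
The clash happens because by-name parameters erase to `Function0`, so the two overloads share one erased signature; a defaulted dummy parameter gives the old form a distinct shape. A standalone sketch of the idea (names and bodies are illustrative, not the exact `SparkSessionProvider` code):

```scala
object DoAsErasureSketch {
  // New curried form: callers write doAs("someone") { ... }
  def doAs[T](user: String)(f: => T): T = {
    // user impersonation elided; just evaluate the body
    f
  }

  // Declaring `def doAs[T](user: String, f: => T): T` alongside the curried
  // form does not compile: both erase to (String, Function0)Object.
  // A defaulted dummy parameter changes the erased signature while keeping
  // the old call style doAs(user, { ... }) working.
  def doAs[T](user: String, f: => T, unused: String = ""): T = doAs(user)(f)
}
```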

Comment on lines -99 to +109
```diff
-      case (t, "table") => doAs(
-        admin, {
-          val purgeOption =
-            if (isCatalogSupportPurge(
-                spark.sessionState.catalogManager.currentCatalog.name())) {
-              "PURGE"
-            } else ""
-          sql(s"DROP TABLE IF EXISTS $t $purgeOption")
-        })
+      case (t, "table") => doAs(admin) {
+        val purgeOption = if (supportPurge) "PURGE" else ""
+        sql(s"DROP TABLE IF EXISTS $t $purgeOption")
+      }
```

Member Author


The curried `def doAs[T](user: String)(f: => T)` reads more nicely here: Scala lets the single-argument second parameter list be supplied as a brace block, so the body can be passed directly.

@codecov-commenter

Codecov Report

❌ Patch coverage is 0% with 1 line in your changes missing coverage. Please review.
✅ Project coverage is 0.00%. Comparing base (f2539d2) to head (b84cec8).

| Files with missing lines | Patch % | Lines |
| --- | --- | --- |
| ...hz/rule/rowfilter/FilterDataSourceV2Strategy.scala | 0.00% | 1 Missing ⚠️ |
Additional details and impacted files
```
@@          Coverage Diff           @@
##           master   #7256   +/-   ##
======================================
  Coverage    0.00%   0.00%
======================================
  Files         696     696
  Lines       43530   43530
  Branches     5883    5883
======================================
  Misses      43530   43530
```


@pan3793 pan3793 self-assigned this Nov 24, 2025
@pan3793 pan3793 added this to the v1.10.3 milestone Nov 24, 2025
@pan3793 pan3793 closed this in 8a67796 Nov 24, 2025
@pan3793
Member Author

pan3793 commented Nov 24, 2025

Thanks, merged to master/1.10

pan3793 added a commit that referenced this pull request Nov 24, 2025
Enable authZ compile support for Spark 4.0 and refactor some test methods

This PR enables authZ compile support for Spark 4.0

```
build/mvn -Pspark-4.0 -Pscala-2.13 -pl extensions/spark/kyuubi-spark-authz -am install -DskipTests
```

```
[ERROR] [Error] /Users/chengpan/Projects/apache-kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/rule/rowfilter/FilterDataSourceV2Strategy.scala:19: object Strategy is not a member of package org.apache.spark.sql
[ERROR] [Error] /Users/chengpan/Projects/apache-kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/rule/rowfilter/FilterDataSourceV2Strategy.scala:23: not found: type Strategy
[ERROR] [Error] /Users/chengpan/Projects/apache-kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RangerSparkExtension.scala:58: type mismatch;
 found   : org.apache.kyuubi.plugin.spark.authz.rule.rowfilter.FilterDataSourceV2Strategy.type
 required: v1.StrategyBuilder
    (which expands to)  org.apache.spark.sql.SparkSession => org.apache.spark.sql.execution.SparkStrategy
[ERROR] three errors found
```

In addition, it refactors two methods in the test helper class `SparkSessionProvider`

1. Refactor `isCatalogSupportPurge` to an abstract method `supportPurge` because some UTs do not rely on the current catalog.
2. Add a new helper method `def doAs[T](user: String)(f: => T): T`, so the caller can write
   ```
   doAs("someone") {
     ...
   }
   ```

Pass GHA to ensure it breaks nothing; manually tested the Spark 4.0 compile:

```
build/mvn -Pspark-4.0 -Pscala-2.13 -pl extensions/spark/kyuubi-spark-authz -am install -DskipTests
```

No.

Closes #7256 from pan3793/authz-refactor.

Closes #7256

b84cec8 [Cheng Pan] add missing override
ede364f [Cheng Pan] Enable authZ compile support for Spark 4.0 and refactor some test methods

Authored-by: Cheng Pan <chengpan@apache.org>
Signed-off-by: Cheng Pan <chengpan@apache.org>
(cherry picked from commit 8a67796)
Signed-off-by: Cheng Pan <chengpan@apache.org>
