Enable authZ compile support for Spark 4.0 and refactor some test methods #7256
Conversation
```diff
 protected val sql: String => DataFrame = spark.sql

-protected def doAs[T](user: String, f: => T): T = {
+protected def doAs[T](user: String, f: => T, unused: String = ""): T = {
```
To keep both overloads, I have to add a dummy parameter; otherwise the compiler complains:

```
[ERROR] [Error] /Users/chengpan/Projects/apache-kyuubi/extensions/spark/kyuubi-spark-authz/src/test/scala/org/apache/kyuubi/plugin/spark/authz/SparkSessionProvider.scala:94: double definition:
protected def doAs[T](user: String, f: => T): T at line 87 and
protected def doAs[T](user: String)(f: => T): T at line 94
have same type after erasure: (user: String, f: Function0): Object
```
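The collision and the workaround can be sketched in a self-contained snippet. The method names follow the PR; the bodies are simplified placeholders (a real `doAs` would switch the effective user before evaluating `f`). After erasure, a by-name parameter `f: => T` becomes `Function0`, so the two-parameter and curried forms would erase to the same signature; the unused default parameter changes the erased arity of one of them:

```scala
object ErasureDemo {
  // Without `unused`, this would erase to doAs(String, Function0): Object,
  // identical to the curried form below -- a "double definition" error.
  // The extra defaulted parameter gives it a distinct erased signature.
  def doAs[T](user: String, f: => T, unused: String = ""): T = f

  // The curried overload can now coexist and be called as doAs("u") { ... }.
  def doAs[T](user: String)(f: => T): T = f
}
```

Call sites stay unchanged for the old form (`doAs("admin", expr)` picks up the default for `unused`), while new code can use the block syntax.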
```diff
-case (t, "table") => doAs(
-  admin, {
-    val purgeOption =
-      if (isCatalogSupportPurge(
-          spark.sessionState.catalogManager.currentCatalog.name())) {
-        "PURGE"
-      } else ""
-    sql(s"DROP TABLE IF EXISTS $t $purgeOption")
-  })
+case (t, "table") => doAs(admin) {
+  val purgeOption = if (supportPurge) "PURGE" else ""
+  sql(s"DROP TABLE IF EXISTS $t $purgeOption")
+}
```
`def doAs[T](user: String)(f: => T)` gives a cleaner call-site format here.
Codecov Report ❌ Patch coverage is

Additional details and impacted files:

```
@@           Coverage Diff           @@
##           master    #7256  +/-  ##
=======================================
  Coverage    0.00%    0.00%
=======================================
  Files         696      696
  Lines       43530    43530
  Branches     5883     5883
=======================================
  Misses      43530    43530
```

View full report in Codecov by Sentry.
Thanks, merged to master/1.10
This PR enables authZ compile support for Spark 4.0. Before this change, the following build command failed:
```
build/mvn -Pspark-4.0 -Pscala-2.13 -pl extensions/spark/kyuubi-spark-authz -am install -DskipTests
```
```
[ERROR] [Error] /Users/chengpan/Projects/apache-kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/rule/rowfilter/FilterDataSourceV2Strategy.scala:19: object Strategy is not a member of package org.apache.spark.sql
[ERROR] [Error] /Users/chengpan/Projects/apache-kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/rule/rowfilter/FilterDataSourceV2Strategy.scala:23: not found: type Strategy
[ERROR] [Error] /Users/chengpan/Projects/apache-kyuubi/extensions/spark/kyuubi-spark-authz/src/main/scala/org/apache/kyuubi/plugin/spark/authz/ranger/RangerSparkExtension.scala:58: type mismatch;
found : org.apache.kyuubi.plugin.spark.authz.rule.rowfilter.FilterDataSourceV2Strategy.type
required: v1.StrategyBuilder
(which expands to) org.apache.spark.sql.SparkSession => org.apache.spark.sql.execution.SparkStrategy
[ERROR] three errors found
```
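The errors above stem from the `org.apache.spark.sql.Strategy` alias no longer resolving under Spark 4.0; the error output itself shows that the expected type expands to `org.apache.spark.sql.execution.SparkStrategy`. As a hedged sketch (this is not the PR's actual diff, and the match arm is a placeholder), one way to fix such a strategy is to extend `SparkStrategy` directly:

```scala
// Hypothetical sketch -- assumes a Spark 4.0 classpath; the real
// FilterDataSourceV2Strategy matches authz row-filter plan nodes.
import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
import org.apache.spark.sql.execution.{SparkPlan, SparkStrategy}

object FilterDataSourceV2Strategy extends SparkStrategy {
  override def apply(plan: LogicalPlan): Seq[SparkPlan] = plan match {
    // ... pattern-match the row-filter marker nodes here ...
    case _ => Nil
  }
}
```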
In addition, it refactors two methods in the test helper class `SparkSessionProvider`:
1. Refactor `isCatalogSupportPurge` into an abstract method `supportPurge`, because some UTs do not rely on the current catalog.
2. Add a new helper method `def doAs[T](user: String)(f: => T): T`, so callers can write:
```
doAs("someone") {
...
}
```
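The two refactors can be sketched together in a minimal, self-contained trait. The member names follow the PR description; the bodies are simplified stand-ins (the real `doAs` would execute `f` as `user`, e.g. via Hadoop's `UserGroupInformation`, and the real provider holds an actual `SparkSession`):

```scala
// Sketch of the refactored test-helper surface, with placeholder bodies.
trait SparkSessionProviderSketch {
  // 1. Abstract flag a suite overrides, instead of inspecting the
  //    current catalog at every call site.
  def supportPurge: Boolean

  // 2. Curried doAs, so the action reads as a block at the call site.
  def doAs[T](user: String)(f: => T): T = f
}

object DemoProvider extends SparkSessionProviderSketch {
  // A suite whose catalog cannot purge tables would override like this.
  override def supportPurge: Boolean = false
}
```

With this shape, a test can write `doAs("someone") { ... }` and build its DROP TABLE statement from `supportPurge` without touching the catalog manager.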
Passed GHA to ensure nothing breaks; manually tested the Spark 4.0 compile:
```
build/mvn -Pspark-4.0 -Pscala-2.13 -pl extensions/spark/kyuubi-spark-authz -am install -DskipTests
```
No.
Closes #7256 from pan3793/authz-refactor.
Closes #7256
b84cec8 [Cheng Pan] add missing override
ede364f [Cheng Pan] Enable authZ compile support for Spark 4.0 and refactor some test methods
Authored-by: Cheng Pan <chengpan@apache.org>
Signed-off-by: Cheng Pan <chengpan@apache.org>
(cherry picked from commit 8a67796)
Signed-off-by: Cheng Pan <chengpan@apache.org>