[SPARK-54375][CONNECT][TESTS] Add assume to cases in PythonPipelineSuite to skip tests when PyConnect dependencies is not available
#53088
Conversation
Test first
[SPARK-54375][CONNECT][TESTS] Add `assume` to cases in `PythonPipelineSuite` to skip tests when PyConnect dependencies is not available

### What changes were proposed in this pull request?

SPARK-54020 added some new test cases in `PythonPipelineSuite`. This PR incorporates `assume(PythonTestDepsChecker.isConnectDepsAvailable)` for these test cases to ensure that the tests are skipped rather than failing when PyConnect dependencies are missing.

### Why are the changes needed?

Enhance the robustness of test cases. Prior to this, when executing `build/sbt "connect/testOnly org.apache.spark.sql.connect.pipelines.PythonPipelineSuite"`:

```
[info] - reading internal datasets outside query function that trigger eager analysis or execution will fail (spark.sql("SELECT * FROM src")) *** FAILED *** (4 milliseconds)
[info]   "org.apache.spark.sql.connect.PythonTestDepsChecker.isConnectDepsAvailable was false" did not contain "TABLE_OR_VIEW_NOT_FOUND" (PythonPipelineSuite.scala:546)
[info]   org.scalatest.exceptions.TestFailedException:
[info]   at org.scalatest.Assertions.newAssertionFailedException(Assertions.scala:472)
[info]   at org.scalatest.Assertions.newAssertionFailedException$(Assertions.scala:471)
[info]   at org.scalatest.Assertions$.newAssertionFailedException(Assertions.scala:1231)
[info]   at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:1295)
[info]   at org.apache.spark.sql.connect.pipelines.PythonPipelineSuite.$anonfun$new$43(PythonPipelineSuite.scala:546)
[info]   at org.apache.spark.sql.connect.pipelines.PythonPipelineSuite.$anonfun$new$43$adapted(PythonPipelineSuite.scala:532)
[info]   at org.apache.spark.SparkFunSuite.$anonfun$gridTest$2(SparkFunSuite.scala:241)
[info]   at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.scala:18)
[info]   at org.scalatest.enablers.Timed$$anon$1.timeoutAfter(Timed.scala:127)
[info]   at org.scalatest.concurrent.TimeLimits$.failAfterImpl(TimeLimits.scala:282)
[info]   at org.scalatest.concurrent.TimeLimits.failAfter(TimeLimits.scala:231)
[info]   at org.scalatest.concurrent.TimeLimits.failAfter$(TimeLimits.scala:230)
...
[info] *** 24 TESTS FAILED ***
[error] Failed tests:
[error]         org.apache.spark.sql.connect.pipelines.PythonPipelineSuite
[error] (connect / Test / testOnly) sbt.TestsFailedException: Tests unsuccessful
```

### Does this PR introduce _any_ user-facing change?

No

### How was this patch tested?

- Pass GitHub Actions
- Manually verify that the relevant tests will no longer fail when PyConnect dependencies are missing.

### Was this patch authored or co-authored using generative AI tooling?

No

Closes #53088 from LuciferYang/SPARK-54375.

Authored-by: yangjie01 <yangjie01@baidu.com>
Signed-off-by: yangjie01 <yangjie01@baidu.com>
(cherry picked from commit 722bcc0)
Signed-off-by: yangjie01 <yangjie01@baidu.com>
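For reference, the guard added by this PR looks roughly like the following. This is a minimal, illustrative sketch: the suite name and test body are placeholders rather than the actual code in `PythonPipelineSuite`; only the `assume(PythonTestDepsChecker.isConnectDepsAvailable)` call and the checker's fully qualified class (visible in the log above) come from the PR.

```scala
import org.apache.spark.SparkFunSuite
import org.apache.spark.sql.connect.PythonTestDepsChecker

// Illustrative stand-in suite showing where the guard goes.
class ExamplePipelineSuite extends SparkFunSuite {
  test("a test that needs the Python Spark Connect client") {
    // Cancels (skips) the test instead of failing it when the PyConnect
    // dependencies are not installed in the local environment.
    assume(PythonTestDepsChecker.isConnectDepsAvailable)
    // ... test body that invokes Python code would go here ...
  }
}
```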
Merged into master and branch-4.1. Thanks @zhengruifeng
Thank you, @LuciferYang and @zhengruifeng! cc @SCHJonathan and @sryza, too.
LGTM, though I wonder if there's some way to implement this that doesn't require remembering to add it to every single test?
@sryza We can override the function |
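One way that suggestion could be realized (an illustrative sketch, not code from this PR) is to override ScalaTest's test-registration method so the dependency check wraps every test registered in the suite. The suite name below is hypothetical, and the signature follows ScalaTest's `AnyFunSuiteLike.test`:

```scala
import org.scalactic.source.Position
import org.scalatest.Tag
import org.apache.spark.SparkFunSuite
import org.apache.spark.sql.connect.PythonTestDepsChecker

// Hypothetical suite: every registered test gets the dependency check,
// so individual tests no longer need to call assume(...) themselves.
class ExampleGuardedSuite extends SparkFunSuite {
  override protected def test(testName: String, testTags: Tag*)(testFun: => Any)(
      implicit pos: Position): Unit = {
    super.test(testName, testTags: _*) {
      // Skip the test when the PyConnect dependencies are unavailable,
      // then run the original test body.
      assume(PythonTestDepsChecker.isConnectDepsAvailable)
      testFun
    }
  }
}
```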
What changes were proposed in this pull request?
SPARK-54020 added some new test cases in `PythonPipelineSuite`. This PR incorporates `assume(PythonTestDepsChecker.isConnectDepsAvailable)` for these test cases to ensure that the tests are skipped rather than failing when PyConnect dependencies are missing.

Why are the changes needed?
Enhance the robustness of test cases. Prior to this, when executing `build/sbt "connect/testOnly org.apache.spark.sql.connect.pipelines.PythonPipelineSuite"`, the affected tests failed with errors such as `"org.apache.spark.sql.connect.PythonTestDepsChecker.isConnectDepsAvailable was false" did not contain "TABLE_OR_VIEW_NOT_FOUND"` instead of being skipped (see the full log in the commit message above).

Does this PR introduce any user-facing change?
No
How was this patch tested?
- Pass GitHub Actions
- Manually verify that the relevant tests will no longer fail when PyConnect dependencies are missing.

Was this patch authored or co-authored using generative AI tooling?
No