[SPARK-22967][TESTS] Fix VersionsSuite's unit tests by changing Windows paths into URI paths #20199
Conversation
```diff
@@ -58,7 +58,7 @@ class VersionsSuite extends SparkFunSuite with Logging {
    */
   protected def withTempDir(f: File => Unit): Unit = {
     val dir = Utils.createTempDir().getCanonicalFile
-    try f(dir) finally Utils.deleteRecursively(dir)
+    f(dir)
```
Leave the deletion work to ShutdownHookManager to avoid a delete IOException caused by a 'file occupied by another program' error on Windows (see SPARK-22967).
Temp dirs will be cleaned up after the unit tests complete, but this is only guaranteed for `test(s"$version: SPARK-17920: Insert into/overwrite avro table")`. Many temp dirs produced by other unit tests still remain on Windows for an unclear reason, maybe 'file occupied by another program' too.
cc @HyukjinKwon
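For context, the idea of deferring cleanup can be sketched roughly in Java (the real code is Scala's `Utils.createTempDir` plus Spark's `ShutdownHookManager`; the class and method names below are illustrative only):

```java
import java.io.File;
import java.io.IOException;
import java.nio.file.Files;

public class TempDirCleanup {

    // Best-effort recursive delete; on Windows this step can fail if
    // another process still holds a handle on a file inside the tree.
    static void deleteRecursively(File file) {
        File[] children = file.listFiles();
        if (children != null) {
            for (File child : children) {
                deleteRecursively(child);
            }
        }
        file.delete();
    }

    // Create a temp dir and defer its deletion to a JVM shutdown hook,
    // instead of deleting it eagerly in a try/finally inside the test.
    static File createTempDir() throws IOException {
        File dir = Files.createTempDirectory("spark-test").toFile();
        Runtime.getRuntime().addShutdownHook(new Thread(() -> deleteRecursively(dir)));
        return dir;
    }

    public static void main(String[] args) throws IOException {
        File dir = createTempDir();
        System.out.println(dir.isDirectory());
    }
}
```

The trade-off discussed in this thread: the shutdown hook avoids the mid-test IOException, but deletion is no longer guaranteed per test, which is why leftover dirs can remain on Windows.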
ok to test
Will take a look soon.
Test build #85847 has finished for PR 20199 at commit
Let's fix the PR title to
```diff
@@ -58,7 +58,7 @@ class VersionsSuite extends SparkFunSuite with Logging {
    */
   protected def withTempDir(f: File => Unit): Unit = {
     val dir = Utils.createTempDir().getCanonicalFile
-    try f(dir) finally Utils.deleteRecursively(dir)
+    f(dir)
```
Hm, actually, shall we just skip the failing tests for now (`assume(!Utils.isWindows)`)? Given what we talked about, only a few tests fail, and only in a few conditions when the Hive version is 0.12?
Actually, only one (the 2nd test mentioned above). Probably `assume(!(Utils.isWindows && version == "0.12"))` would be OK. WDYT?
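The skip condition being proposed is a pure predicate, which a rough Java sketch can isolate (ScalaTest's `assume` cancels the test when the predicate fails; `shouldSkip` here is a hypothetical helper, not Spark code):

```java
public class SkipCondition {

    // Illustrative stand-in for assume(!(Utils.isWindows && version == "0.12")):
    // the test is skipped exactly when running on Windows against Hive
    // client version 0.12, and runs in every other combination.
    static boolean shouldSkip(boolean isWindows, String hiveVersion) {
        return isWindows && "0.12".equals(hiveVersion);
    }

    public static void main(String[] args) {
        System.out.println(shouldSkip(true, "0.12"));   // true: skipped
        System.out.println(shouldSkip(false, "0.12"));  // false: runs
        System.out.println(shouldSkip(true, "1.2"));    // false: runs
    }
}
```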
Yea, sounds good.
I met a similar case when I fixed some Kafka-related tests before - https://issues-test.apache.org/jira/browse/KAFKA-1194. It took me a while to debug and find the issue. I am fine with it if we fail to find the issue that causes the failure in Hive, but it might be worth a try to find the cause of this failure and leave the link here.
I can trigger a test on Windows. Let me leave the build: Build started: [SQL]
Ok, will try.
Build started: [SQL]
Test build #85894 has finished for PR 20199 at commit
```diff
@@ -842,6 +842,7 @@ class VersionsSuite extends SparkFunSuite with Logging {
   }

   test(s"$version: SPARK-17920: Insert into/overwrite avro table") {
+    assume(!(Utils.isWindows && version == "0.12"))
```
@Ngone51 Have you had a chance to check out the issue? If you can't find it, let's leave a comment here saying something like ... it's intentionally skipped because it fails on Windows.
@HyukjinKwon Ok, will comment.
And I have found several issues which describe exactly the same problem:
- https://issues.apache.org/jira/browse/SPARK-12216
- https://issues.apache.org/jira/browse/SPARK-8333
- https://issues.apache.org/jira/browse/SPARK-18979
It seems this is a common issue on Windows that hasn't been resolved yet, and I didn't find an accurate cause for the error in those issues. For now, I suspect this problem may be related to URLClassLoader, and I have reproduced the same issue in an experiment with URLClassLoader. Though I'm not sure this is the real cause yet. Working on this.
We can merge this PR to master after I add the comment, and open a new PR for that issue (if fixed) or discuss under one of those issues. WDYT?
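The URLClassLoader suspicion is plausible because on Windows an open class loader keeps its jars locked, which would explain the 'file occupied by another program' deletion failures. A minimal sketch of the mechanism (the jar path here is hypothetical; this is not the actual Hive client loading code):

```java
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;

public class LoaderLockDemo {
    public static void main(String[] args) throws Exception {
        File jar = new File("downloaded-hive-client.jar"); // hypothetical jar on disk
        URLClassLoader loader = new URLClassLoader(new URL[]{ jar.toURI().toURL() });
        // ... classes and resources would be loaded from the jar here ...
        // On Windows the loader keeps the jar file open, so deleting a temp
        // dir containing it fails until the loader releases its handles:
        loader.close(); // URLClassLoader implements Closeable since Java 7
        // only after close() can jar.delete() reliably succeed on Windows
    }
}
```

If the leaked handle belongs to a loader that lives for the whole JVM, deferring the delete to a shutdown hook still may not help, which matches the leftover temp dirs observed above.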
Nice. Yea, let's merge this one after adding a comment saying it's skipped because it fails under this condition on Windows.
Then, see if we can fix the root cause, and re-enable this test in that PR if everything goes well.
LGTM
Test build #85960 has finished for PR 20199 at commit
Merged to master and branch-2.3.
…path into URI path

## What changes were proposed in this pull request?

Two unit tests will fail due to Windows-format paths:

1. test(s"$version: read avro file containing decimal")
```
org.apache.hadoop.hive.ql.metadata.HiveException: MetaException(message:java.lang.IllegalArgumentException: Can not create a Path from an empty string);
```

2. test(s"$version: SPARK-17920: Insert into/overwrite avro table")
```
Unable to infer the schema. The schema specification is required to create the table `default`.`tab2`.;
org.apache.spark.sql.AnalysisException: Unable to infer the schema. The schema specification is required to create the table `default`.`tab2`.;
```

This PR fixes these two unit tests by changing Windows paths into URI paths.

## How was this patch tested?

Existing tests.

Please review http://spark.apache.org/contributing.html before opening a pull request.

Author: wuyi5 <ngone_5451@163.com>

Closes #20199 from Ngone51/SPARK-22967.

(cherry picked from commit 0552c36)
Signed-off-by: hyukjinkwon <gurwls223@gmail.com>
What changes were proposed in this pull request?
Two unit tests will fail due to Windows-format paths:
1.test(s"$version: read avro file containing decimal")
2.test(s"$version: SPARK-17920: Insert into/overwrite avro table")
This PR fixes these two unit tests by changing Windows paths into URI paths.
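The conversion itself is the standard file-to-URI one. A rough Java sketch of the idea (the Spark tests do the equivalent in Scala; `toUriString` is an illustrative helper, not an API from this PR):

```java
import java.io.File;

public class PathToUri {

    // File.toURI() yields a platform-neutral file: URI. On Windows, a path
    // like C:\Users\me\warehouse becomes file:/C:/Users/me/warehouse, which
    // avoids backslash and drive-letter parsing problems in consumers such
    // as Hadoop's Path.
    static String toUriString(String path) {
        return new File(path).toURI().toString();
    }

    public static void main(String[] args) {
        System.out.println(toUriString("/tmp/spark-warehouse"));
    }
}
```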
How was this patch tested?
Existing tests.
Please review http://spark.apache.org/contributing.html before opening a pull request.