
[SPARK-3970] Remove duplicate removal of local dirs #2826

Closed
wants to merge 2 commits

Conversation

@viirya (Member) commented Oct 16, 2014

The shutdown hook of DiskBlockManager already removes localDirs, so there is no need to also register them with Utils.registerShutdownDeleteDir. Registering them twice causes duplicate removal of these local dirs and the corresponding exceptions.
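The failure mode can be sketched with a toy Scala reproduction. `deleteDirRecursively` below is an assumed simplification, not Spark's actual `Utils.deleteRecursively`; it only illustrates why a second delete of the same dir surfaces as an IOException:

```scala
import java.io.{File, IOException}
import java.nio.file.Files

object DoubleDeleteRepro {
  // Simplified recursive delete: listFiles() returns null when the path is
  // not a directory OR when it no longer exists, so a dir that was already
  // removed by the other cleanup path looks like an I/O error here.
  def deleteDirRecursively(dir: File): Unit = {
    val children = dir.listFiles()
    if (children == null) {
      throw new IOException("Failed to list files for dir: " + dir)
    }
    children.foreach { c =>
      if (c.isDirectory) deleteDirRecursively(c) else c.delete()
    }
    dir.delete()
  }

  def main(args: Array[String]): Unit = {
    val dir = Files.createTempDirectory("spark-local-").toFile
    deleteDirRecursively(dir)      // first cleanup path succeeds
    try {
      deleteDirRecursively(dir)    // second cleanup path: dir is gone
    } catch {
      case e: IOException => println("second delete failed: " + e.getMessage)
    }
  }
}
```

Running both cleanup paths back to back, the second one hits the missing directory and throws, which matches the error reported below.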

@viirya viirya changed the title Remove duplicate removal of local dirs [SPARK-3970] Remove duplicate removal of local dirs Oct 16, 2014
@@ -140,7 +140,6 @@ private[spark] class DiskBlockManager(blockManager: BlockManager, conf: SparkCon
}

private def addShutdownHook() {
-    localDirs.foreach(localDir => Utils.registerShutdownDeleteDir(localDir))
Member:
It seems undesirable to duplicate the delete logic, but I can see that stop needs to trigger a delete. What exception happens? Deleting something that isn't there should not be an error.

Member Author:

Because it tries to delete local dirs that have already been removed, it raises exceptions like the following:

ERROR DiskBlockManager: Exception while deleting local spark dir: /tmp/spark-local-20141016160713-d38f
java.io.IOException: Failed to list files for dir: /tmp/spark-local-20141016160713-d38f/1b
    at org.apache.spark.util.Utils$.listFilesSafely(Utils.scala:666)
    at org.apache.spark.util.Utils$.deleteRecursively(Utils.scala:680)
    .....

Member:

Ah right. I really think deleteRecursively just shouldn't fail in this situation. If the dir can't be listed it should move on. Unfortunately null means "not a dir, or, an error occurred". So it may be resolvable by checking if the dir exists to begin with.

Member Author:

Adding a check to see if the dir exists in listFilesSafely helps. I added another commit for that. However, the duplicate delete logic is still undesirable and should not be there either.
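The existence check described above can be sketched in Scala. `listFilesSafely` here is an assumed simplification of the helper in Utils.scala, not the exact patched code:

```scala
import java.io.{File, IOException}

object SafeListing {
  // If the dir was already deleted by the other cleanup path, treat it as
  // empty instead of raising; only a genuine listing failure still throws.
  def listFilesSafely(file: File): Seq[File] = {
    if (!file.exists()) {
      Seq.empty
    } else {
      val files = file.listFiles()
      if (files == null) {
        // null still means "not a directory, or an I/O error occurred"
        throw new IOException("Failed to list files for dir: " + file)
      }
      files.toSeq
    }
  }
}
```

With this check, the second shutdown hook finds nothing to list and returns an empty sequence rather than logging the IOException shown earlier.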

@andrewor14 (Contributor)

ok to test

SparkQA commented Oct 17, 2014

QA tests have started for PR 2826 at commit 051d4b5.

  • This patch merges cleanly.

SparkQA commented Oct 17, 2014

QA tests have finished for PR 2826 at commit 051d4b5.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@andrewor14 (Contributor)

LGTM, other comments @srowen?

@viirya (Member Author) commented Oct 26, 2014

Is this patch ok to merge?

@andrewor14 (Contributor)

retest this please, just because it's been a while since jenkins last ran.

SparkQA commented Oct 26, 2014

Test build #22249 timed out for PR 2826 at commit 051d4b5 after a configured wait of 120m.

@andrewor14 (Contributor)

retest this please

SparkQA commented Oct 26, 2014

Test build #22258 has finished for PR 2826 at commit 051d4b5.

  • This patch passes all tests.
  • This patch merges cleanly.
  • This patch adds no public classes.

@andrewor14 (Contributor)

Ok I'm merging this

@asfgit asfgit closed this in 6377ada Oct 27, 2014
@viirya viirya deleted the fix_duplicate_localdir_remove branch December 27, 2023 18:16