
[SPARK-21859][CORE] Fix SparkFiles.get failed on driver in yarn-cluster and yarn-client mode#19079

Closed
lgrcyanny wants to merge 1 commit into apache:branch-2.2 from lgrcyanny:fix-yarn-files-problem

Conversation

@lgrcyanny

What changes were proposed in this pull request?

When using SparkFiles.get to read a file on the driver in yarn-client or yarn-cluster mode, it reports a file-not-found exception.
The exception only happens on the driver; SparkFiles.get on executors works fine.
The bug can be reproduced as follows:

```scala
import java.io.File
import scala.io.Source
import org.apache.spark.{SparkConf, SparkContext, SparkFiles}

val conf = new SparkConf().setAppName("SparkFilesTest")
val sc = new SparkContext(conf)

def testOnDriver(fileName: String): Unit = {
    val file = new File(SparkFiles.get(fileName))
    if (!file.exists()) {
        println(s"$file does not exist")
    } else {
        // print file content on driver
        val content = Source.fromFile(file).getLines().mkString("\n")
        println(s"File content: ${content}")
    }
}
// the output will be "file does not exist"
```
```python
import os
from pyspark import SparkConf, SparkContext, SparkFiles

conf = SparkConf().setAppName("spark files test")
sc = SparkContext(conf=conf)

def test_on_driver(filename):
    path = SparkFiles.get(filename)
    print("file path: {}".format(path))
    if os.path.exists(path):
        with open(path) as f:
            lines = f.readlines()
        print(lines)
    else:
        print("file doesn't exist")
        run_command("ls .")  # run_command is the author's shell helper
```

The output will be "file doesn't exist".
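For context on why the repro fails this way: SparkFiles.get does no lookup of its own; it simply joins the filename onto the root directory where files shipped with --files or addFile are placed. A minimal sketch of that resolution logic (a standalone stand-in for illustration, not the actual PySpark source):

```python
import os

def sparkfiles_get(root_dir, filename):
    """Resolve a filename the way SparkFiles.get does: join it onto the
    root directory for distributed files. No existence check is performed."""
    return os.path.abspath(os.path.join(root_dir, filename))

# The path is computed whether or not the file was ever copied there,
# which is why the driver can receive a path that does not exist.
path = sparkfiles_get("/tmp/spark-abc123/userFiles", "README.md")
print(path)
print(os.path.exists(path))
```

So the bug is not in the path computation itself; the file was never placed under the driver's files root in these YARN modes.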

How was this patch tested?

Tested with integration tests and manual tests: the demo cases above were submitted in yarn-cluster and yarn-client mode and the results verified.
The testing commands were:

```
./bin/spark-submit --master yarn-cluster --files README.md --class "testing.SparkFilesTest" testing.jar
./bin/spark-submit --master yarn-client --files README.md --class "testing.SparkFilesTest" testing.jar
./bin/spark-submit --master yarn-cluster --files README.md test_get_files.py
./bin/spark-submit --master yarn-client --files README.md test_get_files.py
```

@AmplabJenkins

Can one of the admins verify this patch?


Tiny nitpick, this might be simpler.

```scala
if (isDriver && conf.get("spark.submit.deployMode", "client") == "client") {
```

Author


Ok, thanks, I will change it.

Author

@lgrcyanny lgrcyanny Aug 30, 2017


Originally, my version was

conf.get("spark.submit.deployMode", "client") == "client"

Then I referred to the SparkContext#deployMode function, which uses

conf.getOption("spark.submit.deployMode").getOrElse("client")

I just wanted to keep the same style as SparkContext.
Which one is better?
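Both spellings resolve identically, so the choice is purely stylistic. A minimal Python sketch (a hypothetical dict-based stand-in for SparkConf, not Spark code) of why the two forms agree:

```python
conf = {"spark.app.name": "SparkFilesTest"}  # hypothetical stand-in for SparkConf

# Style 1: lookup with an inline default, like conf.get(key, "client")
deploy_mode_1 = conf.get("spark.submit.deployMode", "client")

# Style 2: optional lookup plus an explicit fallback, like
# conf.getOption(key).getOrElse("client")
maybe_mode = conf.get("spark.submit.deployMode")  # None when the key is absent
deploy_mode_2 = maybe_mode if maybe_mode is not None else "client"

print(deploy_mode_1, deploy_mode_2)  # client client
```

Either way the driver-side check reads the same deploy mode when the key is unset.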

[SPARK-21859][CORE] Fix SparkFiles.get failed on driver in yarn-cluster and yarn-client mode

Change-Id: I22034f99f571a451b862c1806b7f9350c6133c95
@lgrcyanny lgrcyanny force-pushed the fix-yarn-files-problem branch from 3f0e4a8 to 82b4f1a Compare August 30, 2017 01:26
@jerryshao
Contributor

Currently, for Spark yarn-client applications, we don't support fetching files via the SparkFiles.get API above. Since in client mode you already know where the file is, you may not need to call SparkFiles.get at all.

Fixing this properly actually requires several changes, including support for remote files.

```scala
OptionAssigner(args.totalExecutorCores, STANDALONE | MESOS, ALL_DEPLOY_MODES,
  sysProp = "spark.cores.max"),
OptionAssigner(args.files, LOCAL | STANDALONE | MESOS, ALL_DEPLOY_MODES,
OptionAssigner(args.files, ALL_CLUSTER_MGRS, ALL_DEPLOY_MODES,
```
Contributor

@jerryshao jerryshao Aug 30, 2017


The change here is not correct. For YARN applications we use spark.yarn.dist.files to handle files, and these are added to the distributed cache. Your change breaks the current code.

Author


I have met some users who complained about the weird behavior of SparkFiles.get in yarn-client and yarn-cluster mode. SparkFiles.get makes it very easy for a user to get a file path; why not keep the same behavior in yarn-cluster and yarn-client mode?
Meanwhile, spark.yarn.dist.files is not very easy to use: the files must be uploaded to HDFS in advance. To make Spark on YARN more usable, SparkFiles.get is better.

Contributor


I'm not saying SparkFiles.get is not useful; I'm saying your fix is not correct, because the changes here break the original semantics. Also, we recently added support for remote files; to handle this scenario we should think about how to address the problem for all cluster managers, not only YARN client mode.

Contributor

@jerryshao jerryshao Aug 30, 2017


Also, as @vanzin mentioned, "SparkFiles.get works in yarn-cluster mode" was never a design goal. So to fix this issue, I think you need a more solid patch.

Author


Hi @jerryshao, regarding the remote-files work to handle the yarn-client files problem: is there a JIRA that explains the design? We can wait for a version that resolves the problem.

I think my fix solves the problem simply; do you have any other idea for solving it more elegantly?

Author


May I ask why the OptionAssigner for "spark.files" covers local, standalone, and mesos, but not YARN? Is there any doc explaining this design decision, or maybe this really is an issue.

```scala
OptionAssigner(args.files, LOCAL | STANDALONE | MESOS, ALL_DEPLOY_MODES, "spark.files")
```

@vanzin
Contributor

vanzin commented Aug 30, 2017

Please file PRs against the master branch.

@vanzin
Contributor

vanzin commented Aug 30, 2017

(Unless the issue does not exist in the master branch, in which case please call that out explicitly.)

@lgrcyanny
Author

Hi @vanzin, I have submitted a PR based on the master branch. Please review it, thank you:
#19102

@srowen
Member

srowen commented Sep 1, 2017

Close this @lgrcyanny

@jerryshao
Contributor

Please close this PR @lgrcyanny thanks!

@lgrcyanny lgrcyanny closed this Sep 4, 2017
