Use spark-submit instead of compute-classpath.sh #164

Merged
8 commits merged into master from velvia/use-spark-submit on Jun 15, 2015

Conversation

@velvia (Contributor) commented on Jun 4, 2015

The hope is that this will solve a bunch of the classpath issues that different folks have been seeing, since the job server will launch as a regular Spark application.
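
For reference, launching the job server through spark-submit would look roughly like the sketch below. The main class, jar path, and settings file are illustrative placeholders, not the exact artifacts produced by this branch:

```sh
# Rough sketch only: class name, jar path, and config file are placeholders.
$SPARK_HOME/bin/spark-submit \
  --class spark.jobserver.JobServer \
  --master local[4] \
  /path/to/spark-job-server.jar \
  /path/to/job-server.conf
```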

You can try this branch out, but note that logging just goes to stdout right now. I haven't figured out how to make Spark use our own log4j config, short of copying our log4j.properties into the $SPARK_HOME/conf dir.
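
One untested workaround might be to hand the driver JVM our log4j config through spark-submit options rather than touching $SPARK_HOME/conf; the path below is a placeholder:

```sh
# Untested sketch: same spark-submit invocation as above, plus a driver JVM
# option pointing at our own log4j config instead of $SPARK_HOME/conf.
$SPARK_HOME/bin/spark-submit \
  --class spark.jobserver.JobServer \
  --driver-java-options "-Dlog4j.configuration=file:/path/to/log4j.properties" \
  /path/to/spark-job-server.jar \
  /path/to/job-server.conf
```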

Also, @ankit1010, have a look at this. Two thoughts:

  1. Using spark-submit would be perfect for launching the forked context JVM. In fact, we may be able to use SparkSubmitDriverBootstrapper.scala, which is part of Spark and means we don't even need to deal with process forking ourselves.
  2. Outside of the JobManager, the job server doesn't really need to interact with Spark. Perhaps the main job server (web service) could just launch without Spark as a dependency, which would be much cleaner.

velvia mentioned this pull request on Jun 4, 2015
velvia added a commit that referenced this pull request on Jun 15, 2015:
Use spark-submit instead of compute-classpath.sh
velvia merged commit dca77e4 into master on Jun 15, 2015
velvia deleted the velvia/use-spark-submit branch on June 15, 2015 at 16:41