Create several Spark jobs for testing SparkConf/SparkContext #50

Closed

KenSuenobu opened this issue May 20, 2018 · 1 comment
Labels
difficult (Feature that may be difficult to implement), distribution (Distributed architecture work), testing (Testing for feature complete)
Milestone: 0.2.0

Comments

@KenSuenobu (Owner) commented:

This will go along with testing: CircleCI needs to start an instance of Spark as part of the workflow step. It will need to kick off a Spark server instance as a Docker image, allow the tests to run against it, and then, once the tests have completed, signal to the Spark instance that the jobs are done. (Maybe a write-up on a blog somewhere explains how to do this well?)
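
A minimal sketch of the kind of job such a test could run is below. The object name, the SPARK_MASTER environment variable, and the local[2] fallback are illustrative assumptions, not part of the project; the real master URL would come from however the Dockerized Spark server is exposed in the CI workflow.

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Illustrative smoke test for SparkConf/SparkContext. It runs against a
// standalone Spark master when SPARK_MASTER is set (e.g. by the CI step that
// starts the Spark Docker image), and falls back to an in-process local master.
object SparkContextSmokeTest {
  def main(args: Array[String]): Unit = {
    val master = sys.env.getOrElse("SPARK_MASTER", "local[2]")
    val conf = new SparkConf()
      .setAppName("sparkconf-sparkcontext-smoke-test")
      .setMaster(master)

    val sc = new SparkContext(conf)
    try {
      // A trivial job: verify the context can schedule work and collect a result.
      val sum = sc.parallelize(1 to 100).map(_ * 2).reduce(_ + _)
      assert(sum == 10100, s"unexpected result: $sum")
      println(s"Spark smoke test passed (sum = $sum)")
    } finally {
      // Stop the context cleanly so the Spark master sees the application finish.
      sc.stop()
    }
  }
}
```

In CI, the workflow step that starts the Spark Docker image would export SPARK_MASTER (for example, spark://<container-host>:7077) before running the tests; locally, the job simply runs with the in-process master.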

KenSuenobu added the testing, difficult, and distribution labels on May 20, 2018
KenSuenobu added this to the 0.2.0 milestone on May 20, 2018
@KenSuenobu (Owner, Author) commented:

Code completed; converted RealWorldTest to Spark. (You can see the test code is MUCH larger than the original RealWorldTest code.)
