[Improvement][Spark] Support Local Spark Cluster #15548
Comments
I would like to have a try on this issue.
Current workaround for me is to pass the standalone master URL through the Spark task's extra options, along the lines of the sketch below.
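A minimal sketch of that workaround, assuming the master can be overridden via the task's additional spark-submit options (the hostname, port, class, and jar names here are hypothetical, not values from the original comment):

```bash
# Hypothetical workaround: supply --master in the task's extra options,
# so the generated spark-submit command ends up equivalent to:
spark-submit \
  --master spark://spark-master.example.com:7077 \
  --deploy-mode client \
  --class org.example.MyApp \
  my-app.jar
```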
Thanks @git-blame for the quick workaround. It will indeed work via the extra options, but as mentioned, master is an important parameter in Spark. I will check with the community whether the current behavior was by design in previous discussions; if not, I will add the parameter to the Spark task.
This issue has been automatically marked as stale because it has not had recent activity for 30 days. It will be closed in the next 7 days if no further activity occurs.
still working
This issue has been automatically marked as stale because it has not had recent activity for 30 days. It will be closed in the next 7 days if no further activity occurs.
still working
Search before asking
Description
When a Spark Task executes spark-submit, the 'cluster' and 'client' deploy modes map to `--master yarn` or `--master k8s://...`. I would like an option to use a standalone (local) Spark cluster; in other words, the equivalent spark-submit option is `--master spark://<hostname>:<port>`.
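To make the request concrete, here is a sketch of the commands involved (the master URLs and application jar are placeholders, not actual values from the scheduler):

```bash
# Current behavior: the deploy mode selects a YARN or Kubernetes master
spark-submit --master yarn --deploy-mode cluster app.jar
spark-submit --master k8s://https://<k8s-apiserver>:<port> --deploy-mode cluster app.jar

# Requested: allow targeting a standalone Spark cluster directly
spark-submit --master spark://<hostname>:<port> app.jar
```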
Are you willing to submit a PR?
Code of Conduct