Infer Number of Available Executors on Different Environments #74

Closed · moritzmeister (Contributor) opened this issue on Feb 2, 2021 · 0 comments
If we want to support Maggy on arbitrary Spark clusters, we need a reliable way of inferring the number of executors available on the cluster.

Additionally, as a fallback, the user should be able to specify a lower number of executors, for example when running on a shared cluster.

We have to look at the following properties (see the sketch after this list):

- `spark.dynamicAllocation.maxExecutors` when dynamic allocation is enabled
- `spark.executor.instances` for static allocation
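
A minimal sketch of what this inference could look like in PySpark, assuming a live `SparkSession`; the helper name `infer_num_executors` and the `user_max` fallback parameter are hypothetical, not part of Maggy's API:

```python
from pyspark.sql import SparkSession


def infer_num_executors(user_max=None):
    """Infer how many executors the cluster can provide.

    If `user_max` is given (e.g. on a shared cluster), it caps the result.
    """
    spark = SparkSession.builder.getOrCreate()
    conf = spark.sparkContext.getConf()

    if conf.get("spark.dynamicAllocation.enabled", "false") == "true":
        # Dynamic allocation: maxExecutors is the upper bound; if the
        # property is unset, the bound is effectively unlimited.
        max_execs = conf.get("spark.dynamicAllocation.maxExecutors", None)
        num = int(max_execs) if max_execs is not None else None
    else:
        # Static allocation: executor.instances gives the fixed count.
        num = int(conf.get("spark.executor.instances", "1"))

    # Apply the user-provided cap, which is authoritative when dynamic
    # allocation has no explicit maximum configured.
    if user_max is not None:
        num = user_max if num is None else min(num, user_max)
    return num
```

Note that with dynamic allocation enabled but no explicit `maxExecutors`, there is no meaningful upper bound to read from the config, which is one more reason the user-specified fallback matters.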

Find equivalents on Databricks.
