
Spark Connect support for dagster #19782

Open
MrPowers opened this issue Feb 13, 2024 · 0 comments
What's the use case?

Spark Connect is a client-server architecture for Spark (introduced in Spark 3.4) that's now used by some vendor runtimes, like Databricks Serverless.

Spark Connect comes with some breaking changes. For example, code that accesses the sparkContext (self.spark_session.sparkContext) will not work, because Connect sessions don't expose a driver-side SparkContext.

You should be able to restructure the code so that it works with both traditional Spark & Spark Connect.
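One way to support both is to probe the session at runtime instead of assuming a SparkContext exists. A minimal sketch (the helper name is hypothetical, and it assumes Connect sessions raise when `.sparkContext` is accessed, which is the behavior recent pyspark releases document):

```python
def supports_spark_context(spark) -> bool:
    """Return True if this session exposes a usable SparkContext.

    Classic SparkSession objects have a .sparkContext attribute;
    Spark Connect sessions raise an exception when it is accessed,
    so we probe defensively rather than checking concrete types.
    """
    try:
        return spark.sparkContext is not None
    except Exception:
        return False
```

Call sites that currently assume `self.spark_session.sparkContext` could then branch on this check and fall back to session-level APIs under Connect.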

Ideas of implementation

You can install Spark Connect with pip install "pyspark[connect]" and see what breaks. You can also try out Dagster on a serverless Databricks cluster to see if it works as expected. It looks like dagster accepts Spark RDDs, and the RDD API isn't supported by Spark Connect, so that might be a place where the code breaks.
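For the RDD-based paths, one possible restructuring is to build DataFrames from local data directly, since spark.createDataFrame works on both classic sessions and Spark Connect sessions, while spark.sparkContext.parallelize does not. A sketch (the helper name is illustrative, not dagster's API):

```python
def rows_to_dataframe(spark, rows, schema):
    """Build a DataFrame without touching the RDD API.

    spark.createDataFrame(rows, schema) accepts local data on both
    classic Spark and Spark Connect, whereas the RDD-based pattern
    spark.sparkContext.parallelize(rows) fails under Connect because
    Connect sessions have no driver-side SparkContext.
    """
    return spark.createDataFrame(rows, schema)
```
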

Additional information

No response

Message from the maintainers

Impacted by this issue? Give it a 👍! We factor engagement into prioritization.
