
[FEATURE] Publish docker artifacts for multiple Spark versions with each release #1982

Open
peter-mcclonski opened this issue Apr 16, 2024 · 2 comments
Labels
enhancement New feature or request

Comments

@peter-mcclonski
Contributor

Community Note

  • Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request
  • Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
  • If you are interested in working on this issue or have submitted a pull request, please leave a comment

What is the outcome that you are trying to reach?

Currently, each release of Spark Operator is associated with exactly one version of Spark, generally the latest minor/patch version. Our organization is using Spark 3.4.2, released in November 2023. We would like to use the most recent fixes and features present in Spark Operator without being forced to use a single, specific version of Spark.

Describe the solution you would like

In particular, we would like to see each release of Spark Operator include support for multiple versions of Spark. The support window is likely a point for debate, but we would propose supporting the most recent patch version of each of the three most recent minor versions.
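
For illustration, a matrix-driven release job could look roughly like the sketch below. Everything here is hypothetical: the workflow shape, the `SPARK_IMAGE` build arg, the registry, and the version list are assumptions for discussion, not the repository's actual release setup.

```yaml
# Hypothetical sketch only -- not the repository's actual release workflow.
# Builds and pushes one operator image per supported Spark version,
# tagged as <operator-tag>-spark-<spark-version>.
name: release
on:
  push:
    tags: ["v*"]

jobs:
  build-images:
    runs-on: ubuntu-latest
    strategy:
      matrix:
        # Most recent patch of the three most recent Spark minors (example values).
        spark_version: ["3.3.4", "3.4.3", "3.5.1"]
    steps:
      - uses: actions/checkout@v4
      - name: Build and push operator image
        run: |
          IMAGE=ghcr.io/example/spark-operator:${GITHUB_REF_NAME}-spark-${{ matrix.spark_version }}
          docker build \
            --build-arg SPARK_IMAGE=spark:${{ matrix.spark_version }} \
            -t "$IMAGE" .
          docker push "$IMAGE"
```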

Describe alternatives you have considered

Alternatives include either forcing a project to align its Spark version with the spark-operator Spark version, or forcing the project to build and maintain its own version of the spark-operator Docker image.

Additional context

peter-mcclonski added the enhancement label Apr 16, 2024
@peter-mcclonski
Contributor Author

I would be happy to work on this if the feature is desired.

@peter-mcclonski
Contributor Author

Key questions prior to working:

  • What criteria should be used to determine which Spark version(s) to support with a given release?
  • What criteria should be used to determine the "default" Spark version in the helm chart? (One possible shape for this is sketched below.)
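
To make the second question concrete, the chart could default to the newest supported Spark variant and let users override the tag at install time. A rough values.yaml sketch, where the repository, tag format, and versions are all hypothetical:

```yaml
# Hypothetical values.yaml excerpt -- the tag naming scheme is an assumption.
image:
  repository: ghcr.io/example/spark-operator
  # Defaults to the newest supported Spark; users on an older Spark line
  # would override this at install time, e.g.:
  #   helm install ... --set image.tag=v1.2.0-spark-3.4.3
  tag: v1.2.0-spark-3.5.1
```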

peter-mcclonski added a commit to peter-mcclonski/spark-on-k8s-operator that referenced this issue May 10, 2024
…ons on release

Signed-off-by: Peter Jablonski <mcclonski.peter@gmail.com>
Signed-off-by: Peter McClonski <mcclonski.peter@gmail.com>