Why is the supported PySpark version (2.4) so outdated given Spark 3.2 is available? #2984

@jtzhang17

Description

A very simple request: is it possible to track the most recent major version of PySpark and support it in PySparkProcessor? If the PySpark version in PySparkProcessor is this outdated, I doubt many users will be willing to use it.
