[Improvement]: The Spark optimizer can still function properly even in the presence of class conflicts #3012

@zhoujinsong

Description

Search before asking

  • I have searched in the issues and found no similar issues.

What would you like to be improved?

The Spark optimizer job ships with all the dependencies required for task execution. However, when a library already present in Spark's jars directory conflicts with one of the task's dependencies, the Spark optimizer cannot function properly.

We should fix this because Spark's jars directory typically contains additional dependencies, such as the commonly used iceberg-spark-runtime package.

How should we improve?

Add support for the spark.executor.userClassPathFirst and spark.driver.userClassPathFirst parameters to the Spark optimizer.
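A minimal sketch of how these settings could be passed when submitting the optimizer job. The two configuration keys are standard Spark properties; the jar name and main class below are illustrative placeholders, not the actual optimizer artifacts:

```shell
# Submit the Spark optimizer with user-supplied classes taking precedence
# over the jars already bundled in Spark's classpath. The two --conf keys
# are standard (experimental) Spark properties; the jar and main class
# here are placeholders for illustration only.
spark-submit \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  --class com.example.OptimizerMain \
  optimizer-job.jar
```

With these flags set, classes from the user jar are loaded before those on Spark's own classpath, so a conflicting library shipped with the optimizer (e.g. a different iceberg-spark-runtime version) wins over the copy in Spark's jars directory.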

Are you willing to submit PR?

  • Yes I am willing to submit a PR!

Subtasks

No response

Code of Conduct
