
[SPARK-55746][SQL][TESTS] Fix unable to load custom metric object SupportedV1WriteMetric #54544

Closed
pan3793 wants to merge 1 commit into apache:master from pan3793:SPARK-55746

Conversation

Member

@pan3793 pan3793 commented Feb 27, 2026

What changes were proposed in this pull request?

The bug was introduced by SPARK-50315 (#48867). It does not fail the test, but it produces many warning logs:

```
$ build/sbt "sql/testOnly *V1WriteFallbackSuite"
...
18:06:25.108 WARN org.apache.spark.sql.execution.ui.SQLAppStatusListener: Unable to load custom metric object for class `org.apache.spark.sql.connector.SupportedV1WriteMetric`. Please make sure that the custom metric class is in the classpath and it has 0-arg constructor.
org.apache.spark.SparkException: org.apache.spark.sql.connector.SupportedV1WriteMetric did not have a zero-argument constructor or a single-argument constructor that accepts SparkConf. Note: if the class is defined inside of another Scala class, then its constructors may accept an implicit parameter that references the enclosing class; in this case, you must define the class as a top-level class in order to prevent this extra parameter from breaking Spark's ability to find a valid constructor.
	at org.apache.spark.util.Utils$.$anonfun$loadExtensions$1(Utils.scala:2871)
	at scala.collection.immutable.List.flatMap(List.scala:283)
	at scala.collection.immutable.List.flatMap(List.scala:79)
	at org.apache.spark.util.Utils$.loadExtensions(Utils.scala:2853)
	at org.apache.spark.sql.execution.ui.SQLAppStatusListener.$anonfun$aggregateMetrics$3(SQLAppStatusListener.scala:220)
	at scala.Option.map(Option.scala:242)
	at org.apache.spark.sql.execution.ui.SQLAppStatusListener.$anonfun$aggregateMetrics$2(SQLAppStatusListener.scala:214)
... (repeat many times)
```
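As the exception message explains, the root cause is a JVM-level detail: a class defined inside another Scala class compiles to an inner class whose constructor takes a hidden reference to the enclosing instance, so reflection finds no true zero-argument constructor. The following Java sketch demonstrates that same JVM behavior in isolation (the class names here are hypothetical stand-ins, not Spark's actual test classes):

```java
import java.lang.reflect.Constructor;

public class InnerClassCtorDemo {
    // Non-static inner class: its only constructor secretly takes the
    // enclosing instance as a parameter, so there is no 0-arg constructor
    // for a reflective loader to find.
    class NestedMetric {}

    // Static nested class behaves like a top-level class: it has a
    // genuine 0-arg constructor that reflection can invoke.
    static class TopLevelMetric {}

    public static void main(String[] args) {
        Constructor<?> inner = NestedMetric.class.getDeclaredConstructors()[0];
        Constructor<?> top = TopLevelMetric.class.getDeclaredConstructors()[0];
        // Inner class constructor has one hidden parameter (the outer instance).
        System.out.println("inner ctor params: " + inner.getParameterCount());
        // Static nested class constructor has zero parameters.
        System.out.println("top ctor params: " + top.getParameterCount());
    }
}
```

Running this prints `inner ctor params: 1` and `top ctor params: 0`, which is why moving the metric class to the top level (or making it effectively static) restores the zero-argument constructor that `Utils.loadExtensions` requires.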

Why are the changes needed?

Fix the unit test.

Does this PR introduce any user-facing change?

No.

How was this patch tested?

Verified locally; no warnings are printed after the fix.

Was this patch authored or co-authored using generative AI tooling?

No.

@pan3793 pan3793 changed the title [SPARK-55746] Fix unable to load custom metric object SupportedV1WriteMetric [SPARK-55746][SQL][TESTS] Fix unable to load custom metric object SupportedV1WriteMetric Feb 27, 2026
@pan3793 pan3793 requested review from peter-toth and viirya February 27, 2026 12:23
Member

@dongjoon-hyun dongjoon-hyun left a comment


+1, LGTM.

cc @olaky and @cloud-fan

dongjoon-hyun pushed a commit that referenced this pull request Feb 27, 2026
…portedV1WriteMetric

(commit message body identical to the PR description above)

Closes #54544 from pan3793/SPARK-55746.

Authored-by: Cheng Pan <chengpan@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
(cherry picked from commit a0a092f)
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
dongjoon-hyun pushed a commit that referenced this pull request Feb 27, 2026
…portedV1WriteMetric

(commit message body identical to the PR description above)

Closes #54544 from pan3793/SPARK-55746.

Authored-by: Cheng Pan <chengpan@apache.org>
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
(cherry picked from commit a0a092f)
Signed-off-by: Dongjoon Hyun <dongjoon@apache.org>
@dongjoon-hyun
Member

Since this is a test-only PR, I backported this to master/4.1/4.0 to help the testing of Spark 4.x.
