This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new a0a092f9a993 [SPARK-55746][SQL][TESTS] Fix unable to load custom metric object SupportedV1WriteMetric
a0a092f9a993 is described below

commit a0a092f9a9938d4e98d6adb5ab501475823f5e57
Author: Cheng Pan <[email protected]>
AuthorDate: Fri Feb 27 08:55:17 2026 -0800

    [SPARK-55746][SQL][TESTS] Fix unable to load custom metric object SupportedV1WriteMetric
    
    ### What changes were proposed in this pull request?
    
    The bug was introduced by SPARK-50315 (https://github.com/apache/spark/pull/48867). It does not fail the test, but it floods the test log with warnings:
    ```
    $ build/sbt "sql/testOnly *V1WriteFallbackSuite"
    ...
    18:06:25.108 WARN org.apache.spark.sql.execution.ui.SQLAppStatusListener: Unable to load custom metric object for class `org.apache.spark.sql.connector.SupportedV1WriteMetric`. Please make sure that the custom metric class is in the classpath and it has 0-arg constructor.
    org.apache.spark.SparkException: org.apache.spark.sql.connector.SupportedV1WriteMetric did not have a zero-argument constructor or a single-argument constructor that accepts SparkConf. Note: if the class is defined inside of another Scala class, then its constructors may accept an implicit parameter that references the enclosing class; in this case, you must define the class as a top-level class in order to prevent this extra parameter from breaking Spark's ability to find a valid con [...]
            at org.apache.spark.util.Utils$.$anonfun$loadExtensions$1(Utils.scala:2871)
            at scala.collection.immutable.List.flatMap(List.scala:283)
            at scala.collection.immutable.List.flatMap(List.scala:79)
            at org.apache.spark.util.Utils$.loadExtensions(Utils.scala:2853)
            at org.apache.spark.sql.execution.ui.SQLAppStatusListener.$anonfun$aggregateMetrics$3(SQLAppStatusListener.scala:220)
            at scala.Option.map(Option.scala:242)
            at org.apache.spark.sql.execution.ui.SQLAppStatusListener.$anonfun$aggregateMetrics$2(SQLAppStatusListener.scala:214)
    ... (repeat many times)
    ```
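    The warning stems from a JVM reflection constraint: a Scala case class defined inside another class (or an anonymous class, as in the original test) compiles to a constructor whose hidden first parameter is the enclosing instance, so a reflective loader in the style of `Utils.loadExtensions` cannot find a zero-argument constructor. A minimal standalone sketch of the constraint (plain Scala with illustrative names, not Spark code):

    ```scala
    // Reflection-based loaders need a constructor callable with no arguments.

    // Top-level class with an auxiliary 0-arg constructor: reflection works.
    case class TopLevelMetric(name: String, description: String) {
      def this() = this("dummy", "")
    }

    class Outer {
      // Nested in a class: the compiled constructor takes a hidden `Outer`
      // parameter, so there is no true zero-argument constructor.
      case class InnerMetric(name: String, description: String)
    }

    object ReflectionDemo {
      def main(args: Array[String]): Unit = {
        // Succeeds: instantiates TopLevelMetric("dummy", "").
        val ok = classOf[TopLevelMetric].getDeclaredConstructor().newInstance()
        println(ok.name) // dummy

        // The inner class's only constructor takes (Outer, String, String).
        val ctor = classOf[Outer#InnerMetric].getDeclaredConstructors.head
        println(ctor.getParameterCount) // 3

        // Asking for a 0-arg constructor throws NoSuchMethodException,
        // which is what surfaces as the warning above.
        val lookup = scala.util.Try(classOf[Outer#InnerMetric].getDeclaredConstructor())
        println(lookup.isFailure) // true
      }
    }
    ```

    The diff below applies the first pattern: the case class is hoisted to the top level and given a `def this()` auxiliary constructor.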
    
    ### Why are the changes needed?
    
    Fix the unit test so it no longer emits spurious warning logs.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Verified locally; no warnings are printed after the fix.
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #54544 from pan3793/SPARK-55746.
    
    Authored-by: Cheng Pan <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 .../scala/org/apache/spark/sql/connector/V1WriteFallbackSuite.scala | 6 ++++--
 1 file changed, 4 insertions(+), 2 deletions(-)

diff --git a/sql/core/src/test/scala/org/apache/spark/sql/connector/V1WriteFallbackSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/connector/V1WriteFallbackSuite.scala
index e396232eb70f..d105b63dd78d 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/connector/V1WriteFallbackSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/connector/V1WriteFallbackSuite.scala
@@ -370,6 +370,10 @@ class InMemoryV1Provider
   }
 }
 
+case class SupportedV1WriteMetric(name: String, description: String) extends CustomSumMetric {
+  def this() = this("dummy", "")
+}
+
 class InMemoryTableWithV1Fallback(
     override val name: String,
     override val schema: StructType,
@@ -425,8 +429,6 @@ class InMemoryTableWithV1Fallback(
     }
 
     override def build(): V1Write = new V1Write {
-      case class SupportedV1WriteMetric(name: String, description: String) extends CustomSumMetric
-
       override def supportedCustomMetrics(): Array[CustomMetric] =
         Array(SupportedV1WriteMetric("numOutputRows", "Number of output rows"))
 


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
