singhpk234 commented on code in PR #14854:
URL: https://github.com/apache/iceberg/pull/14854#discussion_r2657000939
##########
spark/v3.4/spark-extensions/src/test/java/org/apache/iceberg/spark/extensions/TestMetadataTables.java:
##########
@@ -990,4 +994,44 @@ public void metadataLogEntriesAfterReplacingTable() throws Exception {
     assertThat(sql("SELECT * FROM %s.metadata_log_entries", tableName))
         .containsExactly(firstEntry, secondEntry, thirdEntry, fourthEntry, fifthEntry);
   }
+
+ @TestTemplate
+ public void testReadableMetricsWithUUIDBounds() throws Exception {
+ // 1. Create table via Spark SQL (Spark-compatible)
+    sql(
+        "CREATE TABLE %s (dummy INT) USING iceberg TBLPROPERTIES ('format-version'='%s')",
+        tableName, formatVersion);
+
+ // 2. Load Iceberg table and evolve schema to UUID
+ Table table = Spark3Util.loadIcebergTable(spark, tableName);
+
Review Comment:
nit: remove the extra blank line, and apply the same change in the other places it was copied? Or maybe it would be better to add the newline just before the comment line instead?
##########
spark/v3.4/spark-extensions/src/test/java/org/apache/iceberg/spark/extensions/TestMetadataTables.java:
##########
@@ -990,4 +994,44 @@ public void metadataLogEntriesAfterReplacingTable() throws Exception {
     assertThat(sql("SELECT * FROM %s.metadata_log_entries", tableName))
         .containsExactly(firstEntry, secondEntry, thirdEntry, fourthEntry, fifthEntry);
   }
+
+ @TestTemplate
+ public void testReadableMetricsWithUUIDBounds() throws Exception {
+ // 1. Create table via Spark SQL (Spark-compatible)
+    sql(
+        "CREATE TABLE %s (dummy INT) USING iceberg TBLPROPERTIES ('format-version'='%s')",
+        tableName, formatVersion);
+
+ // 2. Load Iceberg table and evolve schema to UUID
+ Table table = Spark3Util.loadIcebergTable(spark, tableName);
+
+    table.updateSchema().deleteColumn("dummy").addColumn("id", Types.UUIDType.get()).commit();
+
+ // 3. Insert data via Spark (forces Iceberg to write UUID metrics)
+ UUID uuid = UUID.randomUUID();
+
+ sql("INSERT INTO %s VALUES ('%s')", tableName, uuid.toString());
+
+ // 4. Query readable_metrics
+ // BEFORE FIX: crashes with
+ // java.lang.ClassCastException: java.util.UUID → java.lang.CharSequence
+    Dataset<Row> df = spark.sql("SELECT readable_metrics FROM " + tableName + ".all_files");
+
+ // After fix: must not throw
+ List<Row> rows = df.collectAsList();
+ assertThat(rows).hasSize(1);
+
+ Row readableMetrics = rows.get(0).getStruct(0);
+ assertThat(readableMetrics).isNotNull();
Review Comment:
How about also asserting that the value read back is correct?
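One way to address the suggestion above would be to compare the bound values inside `readable_metrics` with the inserted UUID's string form. That comparison ultimately relies on round-tripping a UUID through the 16-byte big-endian representation Iceberg stores for UUID column bounds. A minimal, self-contained sketch of that round-trip follows; the class and helper names here are hypothetical and not part of the PR:

```java
import java.nio.ByteBuffer;
import java.util.UUID;

public class UuidBoundsSketch {
  // Serialize a UUID to the 16-byte big-endian layout used for fixed[16]/UUID bounds.
  static byte[] toBytes(UUID uuid) {
    ByteBuffer buf = ByteBuffer.allocate(16);
    buf.putLong(uuid.getMostSignificantBits());
    buf.putLong(uuid.getLeastSignificantBits());
    return buf.array();
  }

  // Reconstruct the UUID from those 16 bytes.
  static UUID fromBytes(byte[] bytes) {
    ByteBuffer buf = ByteBuffer.wrap(bytes);
    return new UUID(buf.getLong(), buf.getLong());
  }

  public static void main(String[] args) {
    UUID uuid = UUID.randomUUID();
    UUID roundTripped = fromBytes(toBytes(uuid));
    // The round-tripped value must equal the original; in the test one could
    // similarly assert that the lower/upper bound strings equal uuid.toString().
    System.out.println(uuid.equals(roundTripped));
  }
}
```

In the actual test this would translate into an assertion like checking that the bound fields of the `id` entry in `readableMetrics` render as the inserted UUID's `toString()` value, rather than only asserting non-null.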
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]