Re: [I] Failed to assign splits due to the serialized split size [iceberg]

2024-01-06 Thread via GitHub


javrasya commented on issue #9410:
URL: https://github.com/apache/iceberg/issues/9410#issuecomment-1879638660

   We don't do any deletes actually. I will try to debug it locally somehow on that single file it was failing on, to see why it is big. But regardless, what do you think would be a remedy in such a case? Even if it were a delete, should we simply not do deletes because they break the downstream?





Re: [I] Support partitioned writes [iceberg-python]

2024-01-06 Thread via GitHub


Fokko commented on issue #208:
URL: https://github.com/apache/iceberg-python/issues/208#issuecomment-1879726210

   Hey @jqin61, thanks for replying here. I'm not aware of anyone having already started on this. It would be great if you could take a stab at it 🚀





Re: [PR] Parquet: Support reading INT96 column in row group filter [iceberg]

2024-01-06 Thread via GitHub


manuzhang commented on code in PR #8988:
URL: https://github.com/apache/iceberg/pull/8988#discussion_r1443788439


##
spark/v3.5/spark/src/test/java/org/apache/iceberg/spark/source/TestIcebergSourceTablesBase.java:
##
@@ -2181,20 +2181,28 @@ public void testTableWithInt96Timestamp() throws IOException {
 stagingLocation);
 
 // validate we get the expected results back
-List<Row> expected = spark.table("parquet_table").select("tmp_col").collectAsList();
-List<Row> actual =
-    spark
-        .read()
-        .format("iceberg")
-        .load(loadLocation(tableIdentifier))
-        .select("tmp_col")
-        .collectAsList();
-assertThat(actual).as("Rows must match").containsExactlyInAnyOrderElementsOf(expected);
+testWithFilter("tmp_col < to_timestamp('2000-01-31 08:30:00')", tableIdentifier);

Review Comment:
   Fixed now.






Re: [PR] Core: Remove deprecated method from BaseMetadataTable [iceberg]

2024-01-06 Thread via GitHub


nastra commented on code in PR #9298:
URL: https://github.com/apache/iceberg/pull/9298#discussion_r1443801136


##
spark/v3.5/spark/src/main/java/org/apache/iceberg/spark/Spark3Util.java:
##
@@ -948,6 +950,17 @@ public static org.apache.spark.sql.catalyst.TableIdentifier toV1TableIdentifier(
 return org.apache.spark.sql.catalyst.TableIdentifier.apply(table, database);
   }
 
+  static String baseTableUUID(org.apache.iceberg.Table table) {
+    if (table instanceof HasTableOperations) {
+      TableOperations ops = ((HasTableOperations) table).operations();
+      return ops.current().uuid();
+    } else if (table instanceof BaseMetadataTable) {
+      return ((BaseMetadataTable) table).table().operations().current().uuid();
+    } else {
+      throw new UnsupportedOperationException("Cannot fetch table operations for " + table.name());

Review Comment:
   Should this be replicated across all Spark versions? Also, I would probably update the error message to `Cannot retrieve UUID for table ...`
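
   For illustration, a sketch of how the method could read with that wording, assuming the rest of the diff above stays unchanged:

   ```java
   // Sketch only: same logic as the diff above, with the suggested error message.
   static String baseTableUUID(org.apache.iceberg.Table table) {
     if (table instanceof HasTableOperations) {
       TableOperations ops = ((HasTableOperations) table).operations();
       return ops.current().uuid();
     } else if (table instanceof BaseMetadataTable) {
       return ((BaseMetadataTable) table).table().operations().current().uuid();
     } else {
       throw new UnsupportedOperationException("Cannot retrieve UUID for table " + table.name());
     }
   }
   ```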






Re: [PR] Core: Create JUnit5 version of TableTestBase [iceberg]

2024-01-06 Thread via GitHub


lisirrx commented on code in PR #9217:
URL: https://github.com/apache/iceberg/pull/9217#discussion_r1443802733


##
core/src/test/java/org/apache/iceberg/TestManifestReader.java:
##
@@ -32,17 +32,15 @@
 import org.apache.iceberg.types.Types;
 import org.assertj.core.api.Assertions;
 import org.assertj.core.api.recursive.comparison.RecursiveComparisonConfiguration;
-import org.junit.Assert;
-import org.junit.Assume;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
-
-@RunWith(Parameterized.class)
-public class TestManifestReader extends TableTestBase {
-  @Parameterized.Parameters(name = "formatVersion = {0}")
-  public static Object[] parameters() {
-    return new Object[] {1, 2};
+import org.junit.jupiter.api.Assumptions;
+import org.junit.jupiter.api.TestTemplate;
+import org.junit.jupiter.api.extension.ExtendWith;
+
+@ExtendWith(ParameterizedTestExtension.class)

Review Comment:
   I found that some classes, for example `ScanPlanningAndReportingTestBase` and `TestCommitReporting`, only use the value `2` for formatVersion (in their constructor). I'm not sure whether that is okay for such classes.






Re: [PR] Parquet: Support reading INT96 column in row group filter [iceberg]

2024-01-06 Thread via GitHub


nastra commented on code in PR #8988:
URL: https://github.com/apache/iceberg/pull/8988#discussion_r1443806214


##
spark/v3.5/spark/src/test/java/org/apache/iceberg/spark/source/TestIcebergSourceTablesBase.java:
##
@@ -2181,20 +2181,28 @@ public void testTableWithInt96Timestamp() throws IOException {
 stagingLocation);
 
 // validate we get the expected results back
-List<Row> expected = spark.table("parquet_table").select("tmp_col").collectAsList();
-List<Row> actual =
-    spark
-        .read()
-        .format("iceberg")
-        .load(loadLocation(tableIdentifier))
-        .select("tmp_col")
-        .collectAsList();
-assertThat(actual).as("Rows must match").containsExactlyInAnyOrderElementsOf(expected);
+testWithFilter("tmp_col < to_timestamp('2000-01-31 08:30:00')", tableIdentifier);

Review Comment:
   I think it would be good to update the same test across all Spark versions, because the code fix already applies to all of them.
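
   For context, a rough sketch of what such a `testWithFilter` helper could look like (the actual helper in the PR may differ; this only illustrates comparing the filtered Iceberg read against the filtered Parquet source):

   ```java
   // Hypothetical sketch: run the same filter against the Parquet source table and the
   // Iceberg table, then compare the results regardless of ordering.
   private void testWithFilter(String filterExpr, TableIdentifier tableIdentifier) {
     List<Row> expected =
         spark.table("parquet_table").select("tmp_col").filter(filterExpr).collectAsList();
     List<Row> actual =
         spark
             .read()
             .format("iceberg")
             .load(loadLocation(tableIdentifier))
             .select("tmp_col")
             .filter(filterExpr)
             .collectAsList();
     assertThat(actual).as("Rows must match").containsExactlyInAnyOrderElementsOf(expected);
   }
   ```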






Re: [PR] Spark 3.5: Migrate tests in SQL directory to JUnit5 [iceberg]

2024-01-06 Thread via GitHub


nastra commented on code in PR #9401:
URL: https://github.com/apache/iceberg/pull/9401#discussion_r1443812461


##
spark/v3.5/spark/src/test/java/org/apache/iceberg/spark/sql/TestNamespaceSQL.java:
##
@@ -18,90 +18,131 @@
  */
 package org.apache.iceberg.spark.sql;
 
+import static org.assertj.core.api.Assertions.assertThat;
+import static org.assertj.core.api.Assumptions.assumeThat;
+
 import java.io.File;
 import java.util.List;
 import java.util.Map;
 import java.util.Set;
 import java.util.stream.Collectors;
+import org.apache.iceberg.Parameter;
+import org.apache.iceberg.Parameters;
 import org.apache.iceberg.catalog.Namespace;
 import org.apache.iceberg.catalog.TableIdentifier;
 import org.apache.iceberg.exceptions.NamespaceNotEmptyException;
 import org.apache.iceberg.relocated.com.google.common.collect.ImmutableSet;
 import org.apache.iceberg.relocated.com.google.common.collect.Iterables;
-import org.apache.iceberg.spark.SparkCatalogTestBase;
+import org.apache.iceberg.spark.CatalogTestBase;
+import org.apache.iceberg.spark.SparkCatalogConfig;
 import org.assertj.core.api.Assertions;
-import org.junit.After;
-import org.junit.Assert;
-import org.junit.Assume;
-import org.junit.Test;
+import org.junit.jupiter.api.AfterEach;
+import org.junit.jupiter.api.TestTemplate;
 
-public class TestNamespaceSQL extends SparkCatalogTestBase {
+public class TestNamespaceSQL extends CatalogTestBase {
   private static final Namespace NS = Namespace.of("db");
 
-  private final String fullNamespace;
-  private final boolean isHadoopCatalog;
-
-  public TestNamespaceSQL(String catalogName, String implementation, Map<String, String> config) {
-    super(catalogName, implementation, config);
-    this.fullNamespace = ("spark_catalog".equals(catalogName) ? "" : catalogName + ".") + NS;
-    this.isHadoopCatalog = "testhadoop".equals(catalogName);
+  @Parameter(index = 3)
+  private String fullNamespace;
+
+  @Parameter(index = 4)
+  private boolean isHadoopCatalog;
+
+  @Parameters(
+      name =
+          "catalogName = {0}, implementation = {1}, config = {2}, fullNameSpace = {3}, isHadoopCatalog = {4}")
+  protected static Object[][] parameters() {
+    return new Object[][] {
+      {
+        SparkCatalogConfig.HIVE.catalogName(),
+        SparkCatalogConfig.HIVE.implementation(),
+        SparkCatalogConfig.HIVE.properties(),
+        "testhive." + NS.toString(),

Review Comment:
   We should probably use `SparkCatalogConfig.HIVE.catalogName()` to construct this. If anyone ever updates the underlying name, this test will start to fail.
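
   Concretely, something along these lines (sketch only; the trailing boolean mirrors the isHadoopCatalog parameter and the remaining rows would be adjusted the same way):

   ```java
   // Hypothetical sketch: derive the namespace string from the catalog config instead
   // of hard-coding "testhive"; only the Hive row is shown.
   @Parameters(
       name =
           "catalogName = {0}, implementation = {1}, config = {2}, fullNameSpace = {3}, isHadoopCatalog = {4}")
   protected static Object[][] parameters() {
     return new Object[][] {
       {
         SparkCatalogConfig.HIVE.catalogName(),
         SparkCatalogConfig.HIVE.implementation(),
         SparkCatalogConfig.HIVE.properties(),
         SparkCatalogConfig.HIVE.catalogName() + "." + NS, // was "testhive." + NS.toString()
         false // isHadoopCatalog
       },
       // ... remaining catalog configurations unchanged
     };
   }
   ```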






Re: [PR] Spark 3.5: Migrate tests in SQL directory to JUnit5 [iceberg]

2024-01-06 Thread via GitHub


nastra commented on code in PR #9401:
URL: https://github.com/apache/iceberg/pull/9401#discussion_r1443812495


##
spark/v3.5/spark/src/test/java/org/apache/iceberg/spark/sql/TestPartitionedWritesAsSelect.java:
##
@@ -18,34 +18,35 @@
  */
 package org.apache.iceberg.spark.sql;
 
+import static org.assertj.core.api.Assertions.assertThat;
+
 import java.util.List;
 import java.util.stream.IntStream;
 import org.apache.iceberg.spark.IcebergSpark;
-import org.apache.iceberg.spark.SparkTestBaseWithCatalog;
+import org.apache.iceberg.spark.TestBaseWithCatalog;
 import org.apache.spark.sql.types.DataTypes;
-import org.junit.After;
-import org.junit.Assert;
-import org.junit.Before;
-import org.junit.Test;
+import org.junit.jupiter.api.AfterEach;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.TestTemplate;
 
-public class TestPartitionedWritesAsSelect extends SparkTestBaseWithCatalog {
+public class TestPartitionedWritesAsSelect extends TestBaseWithCatalog {
 
-  private final String targetTable = tableName("target_table");
+  private final String targetTable = "testhadoop.default.target_table";

Review Comment:
   why this change?






Re: [PR] Spark 3.5: Migrate tests in SQL directory to JUnit5 [iceberg]

2024-01-06 Thread via GitHub


nastra commented on code in PR #9401:
URL: https://github.com/apache/iceberg/pull/9401#discussion_r1443812981


##
spark/v3.5/spark/src/test/java/org/apache/iceberg/spark/sql/TestTimestampWithoutZone.java:
##
@@ -145,33 +137,31 @@ public void testAppendTimestampWithZone() {
 toTimestamp("2021-02-01T00:00:00.0");
   }
 
-  @Test
+  @TestTemplate
   public void testCreateAsSelectWithTimestampWithoutZone() {
 sql("INSERT INTO %s VALUES %s", tableName, rowToSqlValues(values));
 
 sql("CREATE TABLE %s USING iceberg AS SELECT * FROM %s", newTableName, 
tableName);
 
-Assert.assertEquals(
-"Should have " + values.size() + " row",
-(long) values.size(),
-scalarSql("SELECT count(*) FROM %s", newTableName));
+assertThat(scalarSql("SELECT count(*) FROM %s", newTableName))
+.as("Should have " + values.size() + " row")
+.isEqualTo((long) values.size());

Review Comment:
   do we need these casts?






Re: [PR] Core: Create JUnit5 version of TableTestBase [iceberg]

2024-01-06 Thread via GitHub


nastra commented on code in PR #9217:
URL: https://github.com/apache/iceberg/pull/9217#discussion_r1443813325


##
core/src/test/java/org/apache/iceberg/TestManifestReader.java:
##
@@ -32,17 +32,15 @@
 import org.apache.iceberg.types.Types;
 import org.assertj.core.api.Assertions;
 import org.assertj.core.api.recursive.comparison.RecursiveComparisonConfiguration;
-import org.junit.Assert;
-import org.junit.Assume;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
-
-@RunWith(Parameterized.class)
-public class TestManifestReader extends TableTestBase {
-  @Parameterized.Parameters(name = "formatVersion = {0}")
-  public static Object[] parameters() {
-    return new Object[] {1, 2};
+import org.junit.jupiter.api.Assumptions;
+import org.junit.jupiter.api.TestTemplate;
+import org.junit.jupiter.api.extension.ExtendWith;
+
+@ExtendWith(ParameterizedTestExtension.class)

Review Comment:
   Those are still parameterized tests, but they use only a single version in this case.






Re: [PR] Flink: Watermark read options [iceberg]

2024-01-06 Thread via GitHub


stevenzwu commented on code in PR #9346:
URL: https://github.com/apache/iceberg/pull/9346#discussion_r1443861169


##
flink/v1.18/flink/src/test/java/org/apache/iceberg/flink/source/TestIcebergSourceSql.java:
##
@@ -39,4 +62,76 @@ public void before() throws IOException {
 .getConfiguration()
 .set(TableConfigOptions.TABLE_DYNAMIC_TABLE_OPTIONS_ENABLED, true);
   }
+
+  private Record generateRecord(Instant t1) {
+    Record record = GenericRecord.create(SCHEMA_TS);
+    record.setField("t1", t1.atZone(ZoneId.systemDefault()).toLocalDateTime());
+    record.setField("t2", t1.getEpochSecond());

Review Comment:
   > If the watermark generation is not set, then the records from the files are returned in the same order as we added to the table. If the watermark generation is set, then the splits are returned in the order based on the data.
   
   Both test methods enable watermark generation.
   
   > If the data is the same in t1 and in t2 columns, then there is no difference when we order by (generate watermarks) based on t1 or t2. So the test is less restrictive.
   
   I guess I now understand why t1 and t2 were previously set to different values: you want the order to be different depending on whether t1 or t2 is the watermark column.
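
   As an illustration, a hypothetical variant of the generator that keeps the two columns distinct, so the observed ordering actually depends on which column drives the watermark (field names and SCHEMA_TS come from the diff; the offset parameter is made up):

   ```java
   // Hypothetical sketch: give t2 a deliberate offset from t1 so the test can tell
   // which column the watermark-based ordering was derived from.
   // Assumes java.time.Duration/Instant/ZoneId plus the Record/GenericRecord types above.
   private Record generateRecord(Instant t1, Duration t2Offset) {
     Record record = GenericRecord.create(SCHEMA_TS);
     record.setField("t1", t1.atZone(ZoneId.systemDefault()).toLocalDateTime());
     record.setField("t2", t1.plus(t2Offset).getEpochSecond());
     return record;
   }
   ```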






Re: [PR] Core: Create JUnit5 version of TableTestBase [iceberg]

2024-01-06 Thread via GitHub


lisirrx commented on code in PR #9217:
URL: https://github.com/apache/iceberg/pull/9217#discussion_r1443873022


##
core/src/test/java/org/apache/iceberg/TestManifestReader.java:
##
@@ -32,17 +32,15 @@
 import org.apache.iceberg.types.Types;
 import org.assertj.core.api.Assertions;
 import org.assertj.core.api.recursive.comparison.RecursiveComparisonConfiguration;
-import org.junit.Assert;
-import org.junit.Assume;
-import org.junit.Test;
-import org.junit.runner.RunWith;
-import org.junit.runners.Parameterized;
-
-@RunWith(Parameterized.class)
-public class TestManifestReader extends TableTestBase {
-  @Parameterized.Parameters(name = "formatVersion = {0}")
-  public static Object[] parameters() {
-    return new Object[] {1, 2};
+import org.junit.jupiter.api.Assumptions;
+import org.junit.jupiter.api.TestTemplate;
+import org.junit.jupiter.api.extension.ExtendWith;
+
+@ExtendWith(ParameterizedTestExtension.class)

Review Comment:
   OK, I got it. I tried overriding `parameters()` with `@Parameters` in the subclass and it works.
   You can push the changes to my branch. Thanks for pointing this out!
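
   For anyone following along, the override is roughly this shape (a sketch only; the exact return type should follow whatever `ParameterizedTestExtension` expects, and the base class name assumes the JUnit5 version of `TableTestBase` introduced in this PR):

   ```java
   // Illustrative only: a subclass that pins formatVersion to 2 by overriding the
   // @Parameters provider of the parameterized base class; test methods omitted.
   @ExtendWith(ParameterizedTestExtension.class)
   public class TestCommitReporting extends TestBase {
     @Parameters(name = "formatVersion = {0}")
     protected static Object[] parameters() {
       return new Object[] {2};
     }
   }
   ```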






Re: [I] Python: Implement Name Mapping to construct iceberg schema when field ids are not present in Data files [iceberg]

2024-01-06 Thread via GitHub


github-actions[bot] commented on issue #7451:
URL: https://github.com/apache/iceberg/issues/7451#issuecomment-1879887719

   This issue has been closed because it has not received any activity in the 
last 14 days since being marked as 'stale'





Re: [I] Python: Implement Name Mapping to construct iceberg schema when field ids are not present in Data files [iceberg]

2024-01-06 Thread via GitHub


github-actions[bot] closed issue #7451: Python: Implement Name Mapping to 
construct iceberg schema when field ids are not present in Data files
URL: https://github.com/apache/iceberg/issues/7451





Re: [I] Iceberg partition date/day equality problem when created on timestamp [iceberg]

2024-01-06 Thread via GitHub


github-actions[bot] commented on issue #6853:
URL: https://github.com/apache/iceberg/issues/6853#issuecomment-1879887734

   This issue has been automatically marked as stale because it has been open 
for 180 days with no activity. It will be closed in next 14 days if no further 
activity occurs. To permanently prevent this issue from being considered stale, 
add the label 'not-stale', but commenting on the issue is preferred when 
possible.





Re: [I] Write iceberg Table Using Pyarrow [iceberg]

2024-01-06 Thread via GitHub


github-actions[bot] commented on issue #6830:
URL: https://github.com/apache/iceberg/issues/6830#issuecomment-1879887741

   This issue has been closed because it has not received any activity in the 
last 14 days since being marked as 'stale'





Re: [I] Do Duck Db supports Iceberg table without converting to Pyarrow [iceberg]

2024-01-06 Thread via GitHub


github-actions[bot] closed issue #6829: Do Duck Db supports Iceberg table 
without converting to Pyarrow
URL: https://github.com/apache/iceberg/issues/6829





Re: [I] Do Duck Db supports Iceberg table without converting to Pyarrow [iceberg]

2024-01-06 Thread via GitHub


github-actions[bot] commented on issue #6829:
URL: https://github.com/apache/iceberg/issues/6829#issuecomment-1879887748

   This issue has been closed because it has not received any activity in the 
last 14 days since being marked as 'stale'





Re: [I] Write iceberg Table Using Pyarrow [iceberg]

2024-01-06 Thread via GitHub


github-actions[bot] closed issue #6830: Write iceberg Table Using Pyarrow
URL: https://github.com/apache/iceberg/issues/6830





[PR] Build: Bump software.amazon.awssdk:bom from 2.21.42 to 2.22.12 [iceberg]

2024-01-06 Thread via GitHub


dependabot[bot] opened a new pull request, #9426:
URL: https://github.com/apache/iceberg/pull/9426

   Bumps software.amazon.awssdk:bom from 2.21.42 to 2.22.12.
   
   
   [![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=software.amazon.awssdk:bom&package-manager=gradle&previous-version=2.21.42&new-version=2.22.12)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
   
   Dependabot will resolve any conflicts with this PR as long as you don't 
alter it yourself. You can also trigger a rebase manually by commenting 
`@dependabot rebase`.
   
   [//]: # (dependabot-automerge-start)
   [//]: # (dependabot-automerge-end)
   
   ---
   
   
   Dependabot commands and options
   
   
   You can trigger Dependabot actions by commenting on this PR:
   - `@dependabot rebase` will rebase this PR
   - `@dependabot recreate` will recreate this PR, overwriting any edits that 
have been made to it
   - `@dependabot merge` will merge this PR after your CI passes on it
   - `@dependabot squash and merge` will squash and merge this PR after your CI 
passes on it
   - `@dependabot cancel merge` will cancel a previously requested merge and 
block automerging
   - `@dependabot reopen` will reopen this PR if it is closed
   - `@dependabot close` will close this PR and stop Dependabot recreating it. 
You can achieve the same result by closing it manually
   - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
   - `@dependabot ignore this major version` will close this PR and stop 
Dependabot creating any more for this major version (unless you reopen the PR 
or upgrade to it yourself)
   - `@dependabot ignore this minor version` will close this PR and stop 
Dependabot creating any more for this minor version (unless you reopen the PR 
or upgrade to it yourself)
   - `@dependabot ignore this dependency` will close this PR and stop 
Dependabot creating any more for this dependency (unless you reopen the PR or 
upgrade to it yourself)
   
   
   





[PR] Build: Bump org.assertj:assertj-core from 3.24.2 to 3.25.1 [iceberg]

2024-01-06 Thread via GitHub


dependabot[bot] opened a new pull request, #9427:
URL: https://github.com/apache/iceberg/pull/9427

   Bumps [org.assertj:assertj-core](https://github.com/assertj/assertj) from 3.24.2 to 3.25.1.
   
   Release notes
   Sourced from [org.assertj:assertj-core's releases](https://github.com/assertj/assertj/releases).
   
   v3.25.1
   :jigsaw: Binary Compatibility
   The release is:
   - Binary compatible with the previous minor version.
   - Binary incompatible with the previous patch version.
   
   :bug: Bug Fixes
   Core
   - Revert "Provide value when assertThatThrownBy/thenThrownBy fail" ([#3318](https://redirect.github.com/assertj/assertj/issues/3318))
   - Revert "fix: containsExactly does not work properly with maps not using equals to compare keys" ([#3321](https://redirect.github.com/assertj/assertj/issues/3321))
   
   v3.25.0
   :jigsaw: Binary Compatibility
   The release is [binary compatible](https://assertj.github.io/doc/#binary-compatibility) with the previous minor version.
   
   :no_entry_sign: Deprecated
   Core
   - Deprecate the following date/time related assertions in favor of isCloseTo: isEqualToIgnoringHours, isEqualToIgnoringMinutes, isEqualToIgnoringSeconds, isEqualToIgnoringMillis, isEqualToIgnoringNanos, isInSameHourAs, isInSameMinuteAs, isInSameSecondAs
   - Deprecate asList in favor of asInstanceOf ([#3138](https://redirect.github.com/assertj/assertj/issues/3138))
   
   :sparkles: New Features
   Core
   - Add Descriptable#describedAs(Supplier)
   - Add isInThePast and isInTheFuture to LocalDate assertions ([#2933](https://redirect.github.com/assertj/assertj/issues/2933))
   - Add isInThePast and isInTheFuture to the missing Java 8 date/time types ([#2947](https://redirect.github.com/assertj/assertj/issues/2947))
   - Add isRecord and isNotRecord to Class assertions ([#2968](https://redirect.github.com/assertj/assertj/issues/2968))
   - Add hasNullValue and doesNotHaveNullValue to AtomicReferenceAssert ([#2969](https://redirect.github.com/assertj/assertj/issues/2969))
   - Add asBoolean|Byte|Short|Int|Long|Float|Double to String assertions ([#2580](https://redirect.github.com/assertj/assertj/issues/2580))
   - Add hasRecordComponents to Class assertions ([#2995](https://redirect.github.com/assertj/assertj/issues/2995))
   - Add getters for field path in ComparisonDifference ([#3007](https://redirect.github.com/assertj/assertj/issues/3007))
   - Allow to compare enum and string fields in the recursive comparison ([#2616](https://redirect.github.com/assertj/assertj/issues/2616))
   - Provide value when assertThatThrownBy / thenThrownBy fail ([#3043](https://redirect.github.com/assertj/assertj/issues/3043))
   - Add isSealed and isNotSealed to Class assertions ([#3080](https://redirect.github.com/assertj/assertj/issues/3080))
   - Add assertThatCharSequence to disambiguate Groovy's GString ([#3132](https://redirect.github.com/assertj/assertj/issues/3132))
   
   ... (truncated)
   
   Commits
   - [65f6433](https://github.com/assertj/assertj/commit/65f6433d26f74795434ac2ef2118cd5cec7c1be4) [maven-release-plugin] prepare release assertj-build-3.25.1
   - [4753165](https://github.com/assertj/assertj/commit/4753165446ddec5e25250e52e6e5582093793262) Revert "fix: containsExactly does not work properly with maps not using equal...
   - [2a7c5a6](https://github.com/assertj/assertj/commit/2a7c5a6e205e29d42e9886657987893cb03b6875) Revert "Provide value when assertThatThrownBy/thenThrownBy fail" ([#3318](https://redirect.github.com/assertj/assertj/issues/3318))
   - [887f97b](https://github.com/assertj/assertj/commit/887f97b65da068e03308a6373a0c8a76912377aa) Update license headers
   - [25347d5](https://github.com/assertj/assertj/commit/25347d57bb28295548ec6bd5b06e70b6d03496c9) [maven-release-plugin] prepare for next development iteration
   - [2c1c083](https://github.com/assertj/assertj/commit/2c1c0839af66989c2d30d5b556799ca75ef2c246) [maven-release-plugin] prepare release assertj-build-3.25.0
   - [c44129d](https://github.com/assertj/assertj/commit/c44129de725f5a3ecbaba16dd7b9ec31fd9e293b) Move flatten-maven-plugin version to separate property
   - [31cefaf](https://github.com/assertj/assertj/commit/31cefaf68c04e13c6703033151f776251782af85) Apply flatten-maven-plugin to assertj-core and assertj-guava ([#3311](https://redirect.github.com/assertj/assertj/issues/3311))
   - [435d183](https://github.com/assertj/assertj/commit/435d183f2ededb18246338fb780098fabfba700b) chore(deps-dev): bump org.testng:testng from 7.8.0 to 7.9.0 ([#3312](https://redirect.github.com/assertj/assertj/issues/3312))
   - [e044604](https://github.com/assertj/assertj/commit/e044604d99cdb513177d467886d51c47e0163251) chore(deps-dev): bump nl.jqno.equalsverifier:equalsverifier from 3.15.4 to 3...
   - Additional commits viewable in [compare view](https://github.com/assertj/assertj/compare/assertj-build-3.24.2...assertj-build-3.25.1)
   
   [![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/

Re: [PR] Build: Bump software.amazon.awssdk:bom from 2.21.42 to 2.22.9 [iceberg]

2024-01-06 Thread via GitHub


dependabot[bot] closed pull request #9391: Build: Bump 
software.amazon.awssdk:bom from 2.21.42 to 2.22.9
URL: https://github.com/apache/iceberg/pull/9391





Re: [PR] Build: Bump software.amazon.awssdk:bom from 2.21.42 to 2.22.9 [iceberg]

2024-01-06 Thread via GitHub


dependabot[bot] commented on PR #9391:
URL: https://github.com/apache/iceberg/pull/9391#issuecomment-1879931808

   Superseded by #9426.





[PR] Build: Bump com.palantir.baseline:gradle-baseline-java from 4.42.0 to 5.32.0 [iceberg]

2024-01-06 Thread via GitHub


dependabot[bot] opened a new pull request, #9428:
URL: https://github.com/apache/iceberg/pull/9428

   Bumps [com.palantir.baseline:gradle-baseline-java](https://github.com/palantir/gradle-baseline) from 4.42.0 to 5.32.0.
   
   Release notes
   Sourced from [com.palantir.baseline:gradle-baseline-java's releases](https://github.com/palantir/gradle-baseline/releases).
   
   5.32.0
   - Improvement: Warn when using records with array fields as their equality behavior is surprising ([palantir/gradle-baseline#2672](https://redirect.github.com/palantir/gradle-baseline/pull/2672))
   - Improvement: Add Visitor to the list of DO_NOT_IMPORT_INNER ([palantir/gradle-baseline#2677](https://redirect.github.com/palantir/gradle-baseline/pull/2677))
   - Fix: Add missing space to message in BaselineCircleCi.java ([palantir/gradle-baseline#2692](https://redirect.github.com/palantir/gradle-baseline/pull/2692))
   
   5.31.0
   - Improvement: Variables annotated with @RegisterExtension are now exempt from the StrictUnusedVariable check. ([palantir/gradle-baseline#2664](https://redirect.github.com/palantir/gradle-baseline/pull/2664))
   
   5.30.0
   - Improvement: Upgrade error-prone to the latest release ([palantir/gradle-baseline#2651](https://redirect.github.com/palantir/gradle-baseline/pull/2651))
   
   5.29.0
   Automated release, no documented user facing changes
   
   5.28.0
   - Fix: Ensure that the baseline-exact-dependencies plugin does not eagerly force the creation of the compileClasspath configuration, which causes "'languageVersion' is final and cannot be changed any further." errors in Gradle >=8.3. ([palantir/gradle-baseline#2645](https://redirect.github.com/palantir/gradle-baseline/pull/2645))
   
   5.27.0
   - Fix: baseline-exact-dependencies is now far more lazy around Configuration creation in order to support Gradle 8. Fixed attempt to make this work that should not break --write-locks. ([palantir/gradle-baseline#2644](https://redirect.github.com/palantir/gradle-baseline/pull/2644))
   
   5.26.0
   - Fix: baseline-java-versions avoids ClassCastException when using custom JDKs under Gradle 8. ([palantir/gradle-baseline#2641](https://redirect.github.com/palantir/gradle-baseline/pull/2641))
   
   5.25.0
   - Fix: Revert "Lazy exact dependencies" ([palantir/gradle-baseline#2642](https://redirect.github.com/palantir/gradle-baseline/pull/2642))
   
   5.24.0
   - Fix: baseline-exact-dependencies is now far more lazy around Configuration creation in order to support Gradle 8. ([palantir/gradle-baseline#2639](https://redirect.github.com/palantir/gradle-baseline/pull/2639))
   
   5.23.0
   - Fix: Use a Proxy for JavaInstallationMetadata so we can work across Gradle 7 and 8. ([palantir/gradle-baseline#2605](https://redirect.github.com/palantir/gradle-baseline/pull/2605))
   
   ... (truncated)
   
   Commits
   - [9ba525c](https://github.com/palantir/gradle-baseline/commit/9ba525cbf56ea94848d8fff26a6c76da9cb2e8ef) Autorelease 5.32.0
   - [f03f262](https://github.com/palantir/gradle-baseline/commit/f03f26270fd37ea142e7fc55b4e2e003727a5399) Add Visitor to the list of DO_NOT_IMPORT_INNER for intellij formatting (#...
   - [ed824e3](https://github.com/palantir/gradle-baseline/commit/ed824e335aa80d4dd12f99e95cb8ba0ee9c290c5) Add missing space to message in BaselineCircleCi.java ([#2692](https://redirect.github.com/palantir/gradle-baseline/issues/2692))
   - [f7d38c6](https://github.com/palantir/gradle-baseline/commit/f7d38c6dedaaf317cb29dc8a60ab19f8b2921375) Excavator: Upgrade Jackson to the latest stable release ([#2691](https://redirect.github.com/palantir/gradle-baseline/issues/2691))
   - [e5bbec9](https://github.com/palantir/gradle-baseline/commit/e5bbec99042aa43fbd1710ddea8f00199bf0) Excavator: Upgrade Jackson to the latest stable release ([#2690](https://redirect.github.com/palantir/gradle-baseline/issues/2690))
   - [39630ef](https://github.com/palantir/gradle-baseline/commit/39630ef680348a07730d36204d50afe7dd92512e) Excavator: Upgrade Jackson to the latest stable release ([#2689](https://redirect.github.com/palantir/gradle-baseline/issues/2689))
   - [3e59225](https://github.com/palantir/gradle-baseline/commit/3e5922542c0f278f2cdd5f4e32f28a144a6d9987) Excavator: Render CircleCI file using template specified in .circleci/templa...
   - https://github.com/palantir/gradle-baseline/

[PR] Build: Bump com.google.errorprone:error_prone_annotations from 2.24.0 to 2.24.1 [iceberg]

2024-01-06 Thread via GitHub


dependabot[bot] opened a new pull request, #9429:
URL: https://github.com/apache/iceberg/pull/9429

   Bumps [com.google.errorprone:error_prone_annotations](https://github.com/google/error-prone) from 2.24.0 to 2.24.1.
   
   Release notes
   Sourced from [com.google.errorprone:error_prone_annotations's releases](https://github.com/google/error-prone/releases).
   
   Error Prone 2.24.1
   Changes:
   - Add an assertion to try to help debug [google/error-prone#4225](https://redirect.github.com/google/error-prone/issues/4225)
   
   Full Changelog: https://github.com/google/error-prone/compare/v2.24.0...v2.24.1
   
   Commits
   - [ecf7e10](https://github.com/google/error-prone/commit/ecf7e103b265b3f51ce8e317fe9ddf4e9477c98e) Release Error Prone 2.24.1
   - [2bd7859](https://github.com/google/error-prone/commit/2bd7859957ebf205f1c73f9d38d9f85338b80773) Add an assertion to try to help debug https://github.com/google/error-prone/i...
   - [58a9e80](https://github.com/google/error-prone/commit/58a9e8082b2344f8fb2c3814112b46104708bbab) Update a few checks (and a class of tests, with AbstractToString) to handle #...
   - [fd21bc9](https://github.com/google/error-prone/commit/fd21bc9e4e9737b86fdd792bf5e71f588aee5c27) Reflow a comment that didn't appreciate being formatted in unknown commit
   - [63cf192](https://github.com/google/error-prone/commit/63cf19274e382089693102a19bf31d7dbe791807) Update CanIgnoreReturnValueSuggester summary.
   - [5fa727a](https://github.com/google/error-prone/commit/5fa727a5e4e6a368a92c6f36224dbe9716ed5659) Actually test that hasExtraParameterForEnclosingInstance works.
   - See full diff in [compare view](https://github.com/google/error-prone/compare/v2.24.0...v2.24.1)
   
   [![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=com.google.errorprone:error_prone_annotations&package-manager=gradle&previous-version=2.24.0&new-version=2.24.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
   
   Dependabot will resolve any conflicts with this PR as long as you don't 
alter it yourself. You can also trigger a rebase manually by commenting 
`@dependabot rebase`.
   
   [//]: # (dependabot-automerge-start)
   [//]: # (dependabot-automerge-end)
   
   ---
   
   
   Dependabot commands and options
   
   
   You can trigger Dependabot actions by commenting on this PR:
   - `@dependabot rebase` will rebase this PR
   - `@dependabot recreate` will recreate this PR, overwriting any edits that 
have been made to it
   - `@dependabot merge` will merge this PR after your CI passes on it
   - `@dependabot squash and merge` will squash and merge this PR after your CI 
passes on it
   - `@dependabot cancel merge` will cancel a previously requested merge and 
block automerging
   - `@dependabot reopen` will reopen this PR if it is closed
   - `@dependabot close` will close this PR and stop Dependabot recreating it. 
You can achieve the same result by closing it manually
   - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
   - `@dependabot ignore this major version` will close this PR and stop 
Dependabot creating any more for this major version (unless you reopen the PR 
or upgrade to it yourself)
   - `@dependabot ignore this minor version` will close this PR and stop 
Dependabot creating any more for this minor version (unless you reopen the PR 
or upgrade to it yourself)
   - `@dependabot ignore this dependency` will close this PR and stop 
Dependabot creating any more for this dependency (unless you reopen the PR or 
upgrade to it yourself)
   
   
   





[PR] Build: Bump mkdocs-monorepo-plugin from 1.0.5 to 1.1.0 [iceberg]

2024-01-06 Thread via GitHub


dependabot[bot] opened a new pull request, #9430:
URL: https://github.com/apache/iceberg/pull/9430

   Bumps [mkdocs-monorepo-plugin](https://github.com/backstage/mkdocs-monorepo-plugin) from 1.0.5 to 1.1.0.
   
   Changelog
   Sourced from [mkdocs-monorepo-plugin's changelog](https://github.com/backstage/mkdocs-monorepo-plugin/blob/master/docs/CHANGELOG.md).
   
   1.1.0
   - Dropped official support for Python 3.7
   - Added official support for Python 3.12
   - Replaced deprecated package distutils ([#118](https://redirect.github.com/backstage/mkdocs-monorepo-plugin/pull/118)) courtesy of [@PauloASilva](https://github.com/PauloASilva) and [@piotr1212](https://github.com/piotr1212)
   
   1.0.4
   - Resolve a bug that prevented this plugin from working with mkdocs >= v1.4.0
   
   1.0.3
   - Allow no specification for both `nav` and `docs_dir`
   
   1.0.2
   - Fixed edit URLs to use realpath of pages
   
   1.0.1
   - Fixed edit URLs for included pages
   
   1.0.0
   - This package has been promoted to v1.0!
   
   Note: Going forward, this package will follow [semver](https://semver.org/#semantic-versioning-200) conventions.
   
   0.5.3
   - Don't run on_serve if on_config was skipped
   
   0.5.2
   - Add support for wildcard include statement
   
   0.5.1
   - Fix "No module named 'slugify'" error when installing in mkdocs environments.
   
   0.5.0
   - Allow mkdocs.yaml in addition to mkdocs.yml
   - Drops support for Python 3.5 (because the minimum mkdocs version which supports the above feature no longer supports it).
   
   0.4.18
   - Allow inclusion of sub-docs mkdocs.yml even if its name isn't URL-friendly. Works by slugifying non-URL friendly names. ([#58](https://redirect.github.com/backstage/mkdocs-monorepo-plugin/issues/58))
   
   ... (truncated)
   
   Commits
   - See full diff in [compare view](https://github.com/backstage/mkdocs-monorepo-plugin/commits)
   
   [![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=mkdocs-monorepo-plugin&package-manager=pip&previous-version=1.0.5&new-version=1.1.0)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
   
   Dependabot will resolve any conflicts with this PR as long as you don't 
alter it yourself. You can also trigger a rebase manually by commenting 
`@dependabot rebase`.
   
   [//]: # (dependabot-automerge-start)
   [//]: # (dependabot-automerge-end)
   
   ---
   
   
   Dependabot commands and options
   
   
   You can trigger Dependabot actions by commenting on this PR:
   - `@dependabot rebase` will rebase this PR
   - `@dependabot recreate` will recreate this PR, overwriting any edits that 
have been made to it
   - `@dependabot merge` will merge this PR after your CI passes on it
   - `@dependabot squash and merge` will squash and merge this PR after your CI 
passes on it
   - `@dependabot cancel merge` will cancel a previously requested merge and 
block automerging
   - `@dependabot reopen` will reopen this PR if it is closed
   - `@dependabot close` will close this PR and stop Dependabot recreating it. 
You can achieve the same result by closing it manually
   - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
   - `@dependabot ignore this major version` will close this PR and stop 
Dependabot creating any more for this major version (unless you reopen the PR 
or upgrade to it yourself)
   - `@dependabot ignore this minor version` will close this PR and stop 
Dependabot creating any more for this minor version (unless you reopen the PR 
or upgrade to it yourself)
   - `@dependabot ignore this dependency` will close this PR and stop 
Dependabot creating any more for this dependency (unless you reopen the PR or 
upgrade to it yourself)
   
   
   





Re: [PR] Spark 3.5: Migrate tests in SQL directory to JUnit5 [iceberg]

2024-01-06 Thread via GitHub


chinmay-bhat commented on code in PR #9401:
URL: https://github.com/apache/iceberg/pull/9401#discussion_r1443954928


##
spark/v3.5/spark/src/test/java/org/apache/iceberg/spark/sql/TestPartitionedWritesAsSelect.java:
##
@@ -18,34 +18,35 @@
  */
 package org.apache.iceberg.spark.sql;
 
+import static org.assertj.core.api.Assertions.assertThat;
+
 import java.util.List;
 import java.util.stream.IntStream;
 import org.apache.iceberg.spark.IcebergSpark;
-import org.apache.iceberg.spark.SparkTestBaseWithCatalog;
+import org.apache.iceberg.spark.TestBaseWithCatalog;
 import org.apache.spark.sql.types.DataTypes;
-import org.junit.After;
-import org.junit.Assert;
-import org.junit.Before;
-import org.junit.Test;
+import org.junit.jupiter.api.AfterEach;
+import org.junit.jupiter.api.BeforeEach;
+import org.junit.jupiter.api.TestTemplate;
 
-public class TestPartitionedWritesAsSelect extends SparkTestBaseWithCatalog {
+public class TestPartitionedWritesAsSelect extends TestBaseWithCatalog {
 
-  private final String targetTable = tableName("target_table");
+  private final String targetTable = "testhadoop.default.target_table";

Review Comment:
   aah this was unnecessary, thanks for catching this. Will revert.






Re: [PR] Spark 3.5: Migrate tests in SQL directory to JUnit5 [iceberg]

2024-01-06 Thread via GitHub


chinmay-bhat commented on code in PR #9401:
URL: https://github.com/apache/iceberg/pull/9401#discussion_r1443955073


##
spark/v3.5/spark/src/test/java/org/apache/iceberg/spark/sql/TestTimestampWithoutZone.java:
##
@@ -145,33 +137,31 @@ public void testAppendTimestampWithZone() {
 toTimestamp("2021-02-01T00:00:00.0");
   }
 
-  @Test
+  @TestTemplate
   public void testCreateAsSelectWithTimestampWithoutZone() {
 sql("INSERT INTO %s VALUES %s", tableName, rowToSqlValues(values));
 
 sql("CREATE TABLE %s USING iceberg AS SELECT * FROM %s", newTableName, 
tableName);
 
-Assert.assertEquals(
-"Should have " + values.size() + " row",
-(long) values.size(),
-scalarSql("SELECT count(*) FROM %s", newTableName));
+assertThat(scalarSql("SELECT count(*) FROM %s", newTableName))
+.as("Should have " + values.size() + " row")
+.isEqualTo((long) values.size());

Review Comment:
   no, not needed. Will update this too.


