JanKaul commented on issue #10043:
URL: https://github.com/apache/iceberg/issues/10043#issuecomment-2861975925
Not-stale
tomtongue opened a new pull request, #13007:
URL: https://github.com/apache/iceberg/pull/13007
*Migrate Spark 3.4 tests based on JUnit 4 to JUnit 5 with AssertJ style. This
is related to https://github.com/apache/iceberg/issues/7160*
This PR migrates the following `SparkCatalogTestBase
gyfora commented on code in PR #12996:
URL: https://github.com/apache/iceberg/pull/12996#discussion_r2078986042
##
flink/v2.0/flink/src/main/java/org/apache/iceberg/flink/sink/dynamic/DynamicRecordInternal.java:
##
@@ -0,0 +1,165 @@
+/*
+ * Licensed to the Apache Software Founda
huaxingao commented on PR #13006:
URL: https://github.com/apache/iceberg/pull/13006#issuecomment-2861923792
cc @amogh-jahagirdar @pan3793 @szehon-ho
huaxingao opened a new pull request, #13006:
URL: https://github.com/apache/iceberg/pull/13006
Reverts apache/iceberg#12494
Sorry, I forgot to squash the commits, so the change history will be lost. I
will revert this PR and redo it.
gyfora commented on PR #12996:
URL: https://github.com/apache/iceberg/pull/12996#issuecomment-2861920417
> Thanks for the review @gyfora! I think it makes sense to rename the
API-facing fields / getters / setters to avoid confusion for users.
I am not yet aware of all the conventions
huaxingao commented on PR #12494:
URL: https://github.com/apache/iceberg/pull/12494#issuecomment-2861917570
I will revert the PR for now
mxm commented on PR #12996:
URL: https://github.com/apache/iceberg/pull/12996#issuecomment-2861897082
Thanks for the review @gyfora! I think it makes sense to rename the
API-facing fields / getters / setters to avoid confusion for users.
mxm commented on code in PR #12996:
URL: https://github.com/apache/iceberg/pull/12996#discussion_r2078956264
##
flink/v2.0/flink/src/main/java/org/apache/iceberg/flink/sink/dynamic/DynamicRecord.java:
##
@@ -0,0 +1,130 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF)
pan3793 commented on PR #12494:
URL: https://github.com/apache/iceberg/pull/12494#issuecomment-2861869770
Congrats on merging to trunk, but alright, the commit history is lost
eventually
mxm commented on code in PR #12996:
URL: https://github.com/apache/iceberg/pull/12996#discussion_r2078954528
##
flink/v2.0/flink/src/main/java/org/apache/iceberg/flink/sink/dynamic/DynamicRecord.java:
##
@@ -0,0 +1,130 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF)
huaxingao commented on PR #12494:
URL: https://github.com/apache/iceberg/pull/12494#issuecomment-2861885739
Sorry I forgot to squash the commits. I will fix this
mxm commented on code in PR #12996:
URL: https://github.com/apache/iceberg/pull/12996#discussion_r2078955200
##
flink/v2.0/flink/src/main/java/org/apache/iceberg/flink/sink/dynamic/DynamicRecord.java:
##
@@ -0,0 +1,130 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF)
mxm commented on code in PR #12996:
URL: https://github.com/apache/iceberg/pull/12996#discussion_r2078953365
##
flink/v2.0/flink/src/main/java/org/apache/iceberg/flink/sink/dynamic/DynamicRecordInternal.java:
##
@@ -0,0 +1,165 @@
+/*
+ * Licensed to the Apache Software Foundatio
mxm commented on code in PR #12996:
URL: https://github.com/apache/iceberg/pull/12996#discussion_r2078954029
##
flink/v2.0/flink/src/main/java/org/apache/iceberg/flink/sink/dynamic/DynamicRecord.java:
##
@@ -0,0 +1,130 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF)
JeonDaehong commented on issue #12937:
URL: https://github.com/apache/iceberg/issues/12937#issuecomment-2861877956
> @JeonDaehong thanks for helping with this; as @nastra mentioned, I will be
able to work on the Flink migration, but it may take me a little while to
start the Flink migra
mxm commented on code in PR #12996:
URL: https://github.com/apache/iceberg/pull/12996#discussion_r2078952111
##
flink/v2.0/flink/src/main/java/org/apache/iceberg/flink/sink/dynamic/DynamicRecordInternal.java:
##
@@ -0,0 +1,165 @@
+/*
+ * Licensed to the Apache Software Foundatio
ajantha-bhat commented on PR #12946:
URL: https://github.com/apache/iceberg/pull/12946#issuecomment-2861876754
@pvary and @gaborkaszab: Any more comments on this? Thanks for the review.
nastra commented on code in PR #12990:
URL: https://github.com/apache/iceberg/pull/12990#discussion_r2078944246
##
core/src/main/java/org/apache/iceberg/metrics/CommitReportParser.java:
##
@@ -26,6 +26,7 @@
public class CommitReportParser {
private static final String TABL
tomtongue commented on issue #12937:
URL: https://github.com/apache/iceberg/issues/12937#issuecomment-2861856077
@JeonDaehong thanks for helping with this; as @nastra mentioned, I will be
able to work on the Flink migration, but it may take me a little while to
start the Flink migration
hariuserx commented on code in PR #12886:
URL: https://github.com/apache/iceberg/pull/12886#discussion_r2078932630
##
spark/v3.5/spark/src/main/java/org/apache/iceberg/spark/Spark3Util.java:
##
@@ -946,7 +946,7 @@ public static List getPartitions(
Object
hariuserx commented on code in PR #12886:
URL: https://github.com/apache/iceberg/pull/12886#discussion_r2078932898
##
api/src/main/java/org/apache/iceberg/types/Conversions.java:
##
@@ -37,7 +37,7 @@ public class Conversions {
private Conversions() {}
- private static fi
tomtongue commented on PR #12998:
URL: https://github.com/apache/iceberg/pull/12998#issuecomment-2861827499
@nastra Thanks for the review!!
nastra merged PR #13003:
URL: https://github.com/apache/iceberg/pull/13003
nastra merged PR #12998:
URL: https://github.com/apache/iceberg/pull/12998
huaxingao commented on PR #12494:
URL: https://github.com/apache/iceberg/pull/12494#issuecomment-2861818506
Thank you all! @amogh-jahagirdar @RussellSpitzer @pan3793 @szehon-ho
@aihuaxu
CTTY opened a new pull request, #1300:
URL: https://github.com/apache/iceberg-rust/pull/1300
## Which issue does this PR close?
- Closes #1156
## What changes are included in this PR?
- Added Glue catalog support for iceberg-playground
- Added `try_new_with_schema
amogh-jahagirdar merged PR #12494:
URL: https://github.com/apache/iceberg/pull/12494
amogh-jahagirdar commented on PR #12494:
URL: https://github.com/apache/iceberg/pull/12494#issuecomment-2861800160
Ok, thank you @huaxingao for your diligence in keeping up with this PR.
I'll go ahead and merge and we can iterate on any follow-ups. When the official
Spark 4.0 release is pe
manuzhang commented on issue #10745:
URL: https://github.com/apache/iceberg/issues/10745#issuecomment-2861793463
@thjaeckle Feel free to open a PR (or draft PR) if you've got it working.
amogh-jahagirdar commented on code in PR #12886:
URL: https://github.com/apache/iceberg/pull/12886#discussion_r2078868929
##
api/src/main/java/org/apache/iceberg/types/Conversions.java:
##
@@ -37,7 +37,7 @@ public class Conversions {
private Conversions() {}
- private st
lliangyu-lin commented on PR #12969:
URL: https://github.com/apache/iceberg/pull/12969#issuecomment-2861720957
Hi @amogh-jahagirdar @nastra, could you take a look at the change and see if
this is something we should fix?
hariuserx commented on code in PR #12886:
URL: https://github.com/apache/iceberg/pull/12886#discussion_r2078864644
##
spark/v3.5/spark/src/main/java/org/apache/iceberg/spark/Spark3Util.java:
##
@@ -946,7 +946,7 @@ public static List getPartitions(
Object
liziyan-lzy commented on code in PR #12254:
URL: https://github.com/apache/iceberg/pull/12254#discussion_r2078864487
##
spark/v3.5/spark/src/main/java/org/apache/iceberg/spark/actions/DeleteOrphanFilesSparkAction.java:
##
@@ -302,21 +303,29 @@ private Dataset actualFileIdentDS()
liziyan-lzy commented on code in PR #12254:
URL: https://github.com/apache/iceberg/pull/12254#discussion_r2078863790
##
spark/v3.5/spark/src/main/java/org/apache/iceberg/spark/actions/DeleteOrphanFilesSparkAction.java:
##
@@ -302,21 +303,29 @@ private Dataset actualFileIdentDS()
Marcus-Rosti commented on issue #5970:
URL: https://github.com/apache/iceberg/issues/5970#issuecomment-2861712946
For posterity as well, I ran into this issue because I had both:
```scala
val icebergDependencies: Seq[ModuleID] = Seq(
"org.apache.iceberg" % "iceberg-core" % Vers
bennychow commented on code in PR #11041:
URL: https://github.com/apache/iceberg/pull/11041#discussion_r2078825706
##
format/view-spec.md:
##
@@ -160,6 +179,57 @@ Each entry in `version-log` is a struct with the following
fields:
| _required_ | `timestamp-ms` | Timestamp when
huaxingao commented on PR #12494:
URL: https://github.com/apache/iceberg/pull/12494#issuecomment-2861662035
@amogh-jahagirdar I have removed the changes from
https://github.com/apache/iceberg/pull/12736/files for Spark 4.0. I have
checked all the recent commits after my rebase yesterday,
singhpk234 opened a new issue, #13005:
URL: https://github.com/apache/iceberg/issues/13005
## About
Presently there is no IRC client implementation supporting scan planning.
This is effectively a prerequisite for the IRC server to support scan planning.
There was an effort in the past to get t
bennychow commented on code in PR #11041:
URL: https://github.com/apache/iceberg/pull/11041#discussion_r2078827742
##
format/view-spec.md:
##
@@ -42,12 +42,28 @@ An atomic swap of one view metadata file for another
provides the basis for maki
Writers create view metadata fil
singhpk234 commented on PR #11180:
URL: https://github.com/apache/iceberg/pull/11180#issuecomment-2861642757
Superseded by https://github.com/apache/iceberg/pull/11180
singhpk234 opened a new pull request, #13004:
URL: https://github.com/apache/iceberg/pull/13004
### About the change
Adding a draft PR to resume the work on scan planning (previous PR:
https://github.com/apache/iceberg/pull/11180). Presently, since the client is
not ready, it c
sungwy commented on issue #1980:
URL: https://github.com/apache/iceberg-python/issues/1980#issuecomment-2861523219
Hi @rongxinyu - thanks for reporting this issue!
I think this is a limitation of the current implementation, which makes use
of the `pyarrow.compute.days_between` functi
sungwy opened a new pull request, #1981:
URL: https://github.com/apache/iceberg-python/pull/1981
# Rationale for this change
Replace existing Auth handling with LegacyOAuth2AuthManager. Tracking issue:
https://github.com/apache/iceberg-python/issues/1909
The
jonathanc-n commented on code in PR #1297:
URL: https://github.com/apache/iceberg-rust/pull/1297#discussion_r2078792467
##
crates/integrations/datafusion/src/table/mod.rs:
##
@@ -130,8 +131,19 @@ impl TableProvider for IcebergTableProvider {
filters: &[Expr],
_
phillipleblanc commented on code in PR #1297:
URL: https://github.com/apache/iceberg-rust/pull/1297#discussion_r2078747795
##
crates/integrations/datafusion/src/table/mod.rs:
##
@@ -130,8 +131,19 @@ impl TableProvider for IcebergTableProvider {
filters: &[Expr],
phillipleblanc commented on code in PR #1297:
URL: https://github.com/apache/iceberg-rust/pull/1297#discussion_r2078747477
##
crates/catalog/memory/src/catalog.rs:
##
@@ -53,6 +53,19 @@ impl MemoryCatalog {
warehouse_location,
}
}
+
+/// Register a
github-actions[bot] closed issue #11391: REST API path resolution is not
general enough when using rest catalog
URL: https://github.com/apache/iceberg/issues/11391
github-actions[bot] commented on issue #11391:
URL: https://github.com/apache/iceberg/issues/11391#issuecomment-2860900050
This issue has been closed because it has not received any activity in the
last 14 days since being marked as 'stale'
huaxingao opened a new pull request, #12494:
URL: https://github.com/apache/iceberg/pull/12494
(no comment)
huaxingao closed pull request #12494: Spark 4.0 integration
URL: https://github.com/apache/iceberg/pull/12494
github-actions[bot] commented on issue #11483:
URL: https://github.com/apache/iceberg/issues/11483#issuecomment-2860900109
This issue has been automatically marked as stale because it has been open
for 180 days with no activity. It will be closed in next 14 days if no further
activity occur
github-actions[bot] commented on PR #12691:
URL: https://github.com/apache/iceberg/pull/12691#issuecomment-2860900277
This pull request has been marked as stale due to 30 days of inactivity. It
will be closed in 1 week if no further activity occurs. If you think that’s
incorrect or this pul
github-actions[bot] commented on issue #10305:
URL: https://github.com/apache/iceberg/issues/10305#issuecomment-2860899924
This issue has been automatically marked as stale because it has been open
for 180 days with no activity. It will be closed in next 14 days if no further
activity occur
github-actions[bot] commented on PR #12530:
URL: https://github.com/apache/iceberg/pull/12530#issuecomment-2860900205
This pull request has been marked as stale due to 30 days of inactivity. It
will be closed in 1 week if no further activity occurs. If you think that’s
incorrect or this pul
github-actions[bot] commented on issue #10618:
URL: https://github.com/apache/iceberg/issues/10618#issuecomment-2860899974
This issue has been automatically marked as stale because it has been open
for 180 days with no activity. It will be closed in next 14 days if no further
activity occur
github-actions[bot] commented on issue #10306:
URL: https://github.com/apache/iceberg/issues/10306#issuecomment-2860899951
This issue has been automatically marked as stale because it has been open
for 180 days with no activity. It will be closed in next 14 days if no further
activity occur
github-actions[bot] closed issue #9898: Parquet bloom filter doesn't work with
nested fields
URL: https://github.com/apache/iceberg/issues/9898
github-actions[bot] commented on issue #10302:
URL: https://github.com/apache/iceberg/issues/10302#issuecomment-2860899897
This issue has been automatically marked as stale because it has been open
for 180 days with no activity. It will be closed in next 14 days if no further
activity occur
github-actions[bot] commented on issue #10043:
URL: https://github.com/apache/iceberg/issues/10043#issuecomment-2860899860
This issue has been automatically marked as stale because it has been open
for 180 days with no activity. It will be closed in next 14 days if no further
activity occur
github-actions[bot] commented on issue #9898:
URL: https://github.com/apache/iceberg/issues/9898#issuecomment-2860899841
This issue has been closed because it has not received any activity in the
last 14 days since being marked as 'stale'
lliangyu-lin opened a new pull request, #426:
URL: https://github.com/apache/iceberg-go/pull/426
### Description
* This makes sure that catalogs that don't support hierarchical namespaces
will be able to run the list tables command correctly.
* This change also aligns with the current behavior in
lliangyu-lin opened a new issue, #425:
URL: https://github.com/apache/iceberg-go/issues/425
### Apache Iceberg version
None
### Please describe the bug 🐞
### Description
When running the list command with a specific namespace (e.g., `go run .
list my_db --catalog glue
amogh-jahagirdar commented on code in PR #12494:
URL: https://github.com/apache/iceberg/pull/12494#discussion_r2078512273
##
spark/v4.0/spark-extensions/src/main/scala/org/apache/iceberg/spark/extensions/IcebergSparkSessionExtensions.scala:
##
@@ -0,0 +1,55 @@
+/*
+ * Licensed t
coderfender commented on issue #12936:
URL: https://github.com/apache/iceberg/issues/12936#issuecomment-2860593719
Started working on this
coderfender commented on code in PR #12824:
URL: https://github.com/apache/iceberg/pull/12824#discussion_r2078576476
##
core/src/test/java/org/apache/iceberg/actions/TestBinPackRewriteFilePlanner.java:
##
@@ -290,7 +290,8 @@ void testValidOptions() {
BinPackRewr
gsoundar commented on issue #1236:
URL: https://github.com/apache/iceberg-rust/issues/1236#issuecomment-2860586306
@xxchan How is your change coming along?
coderfender commented on code in PR #12824:
URL: https://github.com/apache/iceberg/pull/12824#discussion_r2078576961
##
spark/v3.5/spark/src/test/java/org/apache/iceberg/spark/actions/TestRewriteDataFilesAction.java:
##
@@ -1888,6 +1888,88 @@ public void testZOrderRewriteWithSpe
coderfender commented on code in PR #12824:
URL: https://github.com/apache/iceberg/pull/12824#discussion_r2078575561
##
core/src/main/java/org/apache/iceberg/actions/BinPackRewriteFilePlanner.java:
##
@@ -199,30 +214,48 @@ protected long defaultTargetFileSize() {
public FileR
dramaticlly commented on code in PR #12228:
URL: https://github.com/apache/iceberg/pull/12228#discussion_r2078357142
##
core/src/test/java/org/apache/iceberg/hadoop/TestHadoopCatalog.java:
##
@@ -676,6 +677,27 @@ public void testRegisterExistingTable() throws IOException
{
coderfender commented on code in PR #12824:
URL: https://github.com/apache/iceberg/pull/12824#discussion_r2078570233
##
core/src/main/java/org/apache/iceberg/actions/BinPackRewriteFilePlanner.java:
##
@@ -19,11 +19,13 @@
package org.apache.iceberg.actions;
import java.io.IOE
coderfender commented on code in PR #12824:
URL: https://github.com/apache/iceberg/pull/12824#discussion_r2078569878
##
core/src/main/java/org/apache/iceberg/actions/BinPackRewriteFilePlanner.java:
##
@@ -199,30 +214,48 @@ protected long defaultTargetFileSize() {
public FileR
soumya-ghosh commented on code in PR #1626:
URL: https://github.com/apache/iceberg-python/pull/1626#discussion_r2078557847
##
pyiceberg/table/inspect.py:
##
@@ -657,3 +671,30 @@ def all_manifests(self) -> "pa.Table":
lambda args: self._generate_manifests_table(*args
amogh-jahagirdar commented on code in PR #12886:
URL: https://github.com/apache/iceberg/pull/12886#discussion_r2076728051
##
spark/v3.5/spark/src/main/java/org/apache/iceberg/spark/Spark3Util.java:
##
@@ -946,7 +946,7 @@ public static List getPartitions(
JeonDaehong commented on issue #12937:
URL: https://github.com/apache/iceberg/issues/12937#issuecomment-2860451385
> @nastra I can work on this to remove JUnit 4 along with the Spark
migration. But if anyone is already working on this or plans to, I could
collaborate on the migration for thi
amogh-jahagirdar commented on code in PR #12494:
URL: https://github.com/apache/iceberg/pull/12494#discussion_r2078511853
##
spark/v4.0/spark-extensions/src/main/scala/org/apache/iceberg/spark/extensions/IcebergSparkSessionExtensions.scala:
##
@@ -0,0 +1,55 @@
+/*
+ * Licensed t
mrcnc commented on PR #11979:
URL: https://github.com/apache/iceberg/pull/11979#issuecomment-2860319177
> This is useful in case of establishing the mTLS connection
Is this not already possible to configure via system properties (e.g.
`javax.net.ssl.keyStore*`)? I'm wondering if the
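For reference, a minimal sketch of the standard JSSE system properties referred
to above; the paths, store type, and passwords are hypothetical placeholders,
not values from this PR:
```java
// Standard JSSE system properties for supplying a client keystore/truststore,
// which is how mTLS client certificates are typically configured on the JVM.
public class JsseMtlsExample {
    public static void main(String[] args) {
        // Client identity (key material presented during the TLS handshake)
        System.setProperty("javax.net.ssl.keyStore", "/path/to/client-keystore.p12");
        System.setProperty("javax.net.ssl.keyStoreType", "PKCS12");
        System.setProperty("javax.net.ssl.keyStorePassword", "changeit");
        // Trusted server certificates
        System.setProperty("javax.net.ssl.trustStore", "/path/to/truststore.jks");
        System.setProperty("javax.net.ssl.trustStorePassword", "changeit");
        // The same properties can be passed as JVM flags, e.g.
        // -Djavax.net.ssl.keyStore=/path/to/client-keystore.p12
    }
}
```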
dramaticlly commented on code in PR #12228:
URL: https://github.com/apache/iceberg/pull/12228#discussion_r2078360689
##
core/src/test/java/org/apache/iceberg/catalog/CatalogTests.java:
##
@@ -3163,9 +3170,107 @@ public void testRegisterExistingTable() {
assertThatThrownBy((
cccs-jory commented on PR #12999:
URL: https://github.com/apache/iceberg/pull/12999#issuecomment-2860039784
> Considering that the goal is to be able to set it dynamically for each
query
I don't think that's the goal necessarily. For example, an org may create a
Spark session that exe
dramaticlly commented on code in PR #12228:
URL: https://github.com/apache/iceberg/pull/12228#discussion_r2078358404
##
core/src/test/java/org/apache/iceberg/catalog/CatalogTests.java:
##
@@ -3163,9 +3170,107 @@ public void testRegisterExistingTable() {
assertThatThrownBy((
guykhazma commented on PR #12999:
URL: https://github.com/apache/iceberg/pull/12999#issuecomment-2860002849
Considering that the goal is to be able to set it dynamically for each query,
wouldn't it make more sense to see if there is a way to include it in the query
itself instead of having i
RussellSpitzer commented on code in PR #12228:
URL: https://github.com/apache/iceberg/pull/12228#discussion_r2078346013
##
core/src/test/java/org/apache/iceberg/hadoop/TestHadoopCatalog.java:
##
@@ -676,6 +677,27 @@ public void testRegisterExistingTable() throws IOException
{
RussellSpitzer commented on code in PR #12228:
URL: https://github.com/apache/iceberg/pull/12228#discussion_r2078343827
##
core/src/test/java/org/apache/iceberg/catalog/CatalogTests.java:
##
@@ -3163,9 +3170,107 @@ public void testRegisterExistingTable() {
assertThatThrownB
RussellSpitzer commented on code in PR #12228:
URL: https://github.com/apache/iceberg/pull/12228#discussion_r2078343298
##
core/src/test/java/org/apache/iceberg/catalog/CatalogTests.java:
##
@@ -3163,9 +3170,107 @@ public void testRegisterExistingTable() {
assertThatThrownB
RussellSpitzer commented on code in PR #12228:
URL: https://github.com/apache/iceberg/pull/12228#discussion_r2078341793
##
core/src/test/java/org/apache/iceberg/catalog/CatalogTests.java:
##
@@ -3163,9 +3170,107 @@ public void testRegisterExistingTable() {
assertThatThrownB
RussellSpitzer commented on code in PR #12228:
URL: https://github.com/apache/iceberg/pull/12228#discussion_r2078339580
##
core/src/main/java/org/apache/iceberg/BaseMetastoreCatalog.java:
##
@@ -71,23 +70,35 @@ public Table loadTable(TableIdentifier identifier) {
}
@Over
jonathanc-n commented on PR #1290:
URL: https://github.com/apache/iceberg-rust/pull/1290#issuecomment-2859909421
What do you think? @liurenjie1024 @Fokko
jonathanc-n commented on PR #1290:
URL: https://github.com/apache/iceberg-rust/pull/1290#issuecomment-2859907317
Thanks for this PR! I do not think we should move forward with this until
our transaction API is finished. This is a nice start though, and it can
probably be added later and te
jonathanc-n commented on PR #1285:
URL: https://github.com/apache/iceberg-rust/pull/1285#issuecomment-2859896523
I think one of the original problems that the issue brought up was checking
after deserialization; maybe we can figure something out in that regard?
jonathanc-n commented on code in PR #1297:
URL: https://github.com/apache/iceberg-rust/pull/1297#discussion_r2078291461
##
crates/catalog/memory/src/catalog.rs:
##
@@ -53,6 +53,19 @@ impl MemoryCatalog {
warehouse_location,
}
}
+
+/// Register an e
amogh-jahagirdar merged PR #12909:
URL: https://github.com/apache/iceberg/pull/12909
jonathanc-n commented on code in PR #1299:
URL: https://github.com/apache/iceberg-rust/pull/1299#discussion_r2078283965
##
crates/iceberg/src/spec/name_mapping/mod.rs:
##
@@ -55,7 +96,7 @@ pub struct MappedField {
#[serde(default)]
#[serde(skip_serializing_if = "Vec::i
jonathanc-n commented on PR #1299:
URL: https://github.com/apache/iceberg-rust/pull/1299#issuecomment-2859866566
cc @jdockerty @liurenjie1024