sdd opened a new pull request, #228:
URL: https://github.com/apache/iceberg-rust/pull/228
Adds a `negate()` method to `UnboundPredicate`, which will be required as
part of #150
nastra commented on code in PR #9861:
URL: https://github.com/apache/iceberg/pull/9861#discussion_r1510714880
##
format/spec.md:
##
@@ -791,7 +788,7 @@ Each version of table metadata is stored in a metadata
folder under the table’
Notes:
-1. The file system table scheme is
nastra commented on code in PR #9819:
URL: https://github.com/apache/iceberg/pull/9819#discussion_r1510708069
##
format/spec.md:
##
@@ -60,7 +60,7 @@ In addition to row-level deletes, version 2 makes some
requirements stricter for
## Overview
-![Iceberg snapshot structure]
nastra commented on code in PR #9819:
URL: https://github.com/apache/iceberg/pull/9819#discussion_r1510707814
##
docs/docs/flink.md:
##
@@ -24,20 +24,20 @@ Apache Iceberg supports both [Apache
Flink](https://flink.apache.org/)'s DataStr
| Feature support
manuzhang commented on PR #9819:
URL: https://github.com/apache/iceberg/pull/9819#issuecomment-1975912028
@nastra @bitsondatadev Can you please check this again?
nastra commented on code in PR #9860:
URL: https://github.com/apache/iceberg/pull/9860#discussion_r1510704394
##
core/src/test/java/org/apache/iceberg/TestMergeAppend.java:
##
@@ -92,6 +93,77 @@ public void testEmptyTableAppend() {
statuses(Status.ADDED, Status.ADDED));
nastra commented on code in PR #9862:
URL: https://github.com/apache/iceberg/pull/9862#discussion_r1510699475
##
core/src/test/java/org/apache/iceberg/TableTestBase.java:
##
@@ -185,16 +184,22 @@ public TableTestBase(int formatVersion) {
this.V2Assert = new TableAssertions(
theoryxu opened a new pull request, #9864:
URL: https://github.com/apache/iceberg/pull/9864
For issue:
https://github.com/apache/iceberg/issues/9863
nastra commented on issue #9851:
URL: https://github.com/apache/iceberg/issues/9851#issuecomment-1975896032
@gaoshihang I think it would be better to report this issue to Databricks,
as the Iceberg community doesn't maintain Databricks Spark.
nastra closed issue #9851: Error when I do compaction in Databricks
URL: https://github.com/apache/iceberg/issues/9851
sdd commented on PR #207:
URL: https://github.com/apache/iceberg-rust/pull/207#issuecomment-1975882804
OK @liurenjie1024, @Xuanwo, and @ZENOTME, I've addressed the feedback and it's ready for review again. Thanks!
sdd commented on code in PR #207:
URL: https://github.com/apache/iceberg-rust/pull/207#discussion_r1510673592
##
crates/iceberg/src/scan.rs:
##
@@ -163,24 +164,32 @@ impl TableScan {
Ok(iter(file_scan_tasks).boxed())
}
+
+pub async fn execute(
+&self,
sdd commented on code in PR #207:
URL: https://github.com/apache/iceberg-rust/pull/207#discussion_r1510673318
##
crates/iceberg/src/scan.rs:
##
@@ -445,4 +494,22 @@ mod tests {
format!("{}/3.parquet", &fixture.table_location)
);
}
+
+#[tokio::test]
theoryxu opened a new issue, #9863:
URL: https://github.com/apache/iceberg/issues/9863
### Feature Request / Improvement
There is ambiguity in the chapter "Hive type to Iceberg type."
General Hive users assume that the Hive type means the Hive data type.
However, the Hi
liurenjie1024 commented on code in PR #207:
URL: https://github.com/apache/iceberg-rust/pull/207#discussion_r1510666374
##
crates/iceberg/src/scan.rs:
##
@@ -163,24 +164,32 @@ impl TableScan {
Ok(iter(file_scan_tasks).boxed())
}
+
+pub async fn execute(
+
sdd commented on code in PR #207:
URL: https://github.com/apache/iceberg-rust/pull/207#discussion_r1510659973
##
crates/iceberg/src/file_record_batch_reader.rs:
##
@@ -0,0 +1,83 @@
+// Licensed to the Apache Software Foundation (ASF) under one
+// or more contributor license agr
sdd commented on code in PR #207:
URL: https://github.com/apache/iceberg-rust/pull/207#discussion_r1510658654
##
crates/iceberg/src/scan.rs:
##
@@ -163,24 +164,32 @@ impl TableScan {
Ok(iter(file_scan_tasks).boxed())
}
+
+pub async fn execute(
+&self,
JGynther commented on issue #452:
URL: https://github.com/apache/iceberg-python/issues/452#issuecomment-1975812563
Finally had a chance to poke at this.
To me it seems that there is no easy way to implement this. When creating and scanning a StaticTable, the actual location of a parti
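For context, a minimal sketch (not from this thread) of how a pyiceberg StaticTable is typically created: it is loaded straight from a metadata file, so the paths it can resolve are only the ones recorded in that metadata. The metadata location below is hypothetical.
```python
# Hedged sketch for context: a StaticTable is read-only and is built directly
# from an Iceberg metadata file rather than through a catalog.
from pyiceberg.table import StaticTable

# Hypothetical metadata file location, used purely for illustration.
static_table = StaticTable.from_metadata(
    "s3://warehouse/db/my_table/metadata/00001-example.metadata.json"
)

print(static_table.schema())
```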
nk1506 commented on issue #9085:
URL: https://github.com/apache/iceberg/issues/9085#issuecomment-1975752088
@tomtongue, I was working with
[TestOverwrite](https://github.com/apache/iceberg/blob/main/core/src/test/java/org/apache/iceberg/TestOverwrite.java)
and to move this to JUnit 5 I had
himadripal commented on PR #495:
URL: https://github.com/apache/iceberg-python/pull/495#issuecomment-1975671879
we also need to update this part of the doc as part of #486.
bitsondatadev opened a new pull request, #9861:
URL: https://github.com/apache/iceberg/pull/9861
Continuation of #9843. We wanted to get the image fix onto the spec first; this PR makes the remaining updates needed under the `format/` markdown files.
himadripal opened a new pull request, #495:
URL: https://github.com/apache/iceberg-python/pull/495
docs update related to #484 @Fokko
huyuanfeng2018 commented on code in PR #9606:
URL: https://github.com/apache/iceberg/pull/9606#discussion_r1510521311
##
flink/v1.18/flink/src/main/java/org/apache/iceberg/flink/FlinkSchemaUtil.java:
##
@@ -64,26 +68,75 @@ public static Schema convert(TableSchema schema) {
github-actions[bot] commented on issue #1865:
URL: https://github.com/apache/iceberg/issues/1865#issuecomment-1975414030
This issue has been automatically marked as stale because it has been open
for 180 days with no activity. It will be closed in the next 14 days if no further activity occurs.
stevenzwu commented on code in PR #9606:
URL: https://github.com/apache/iceberg/pull/9606#discussion_r1509461051
##
flink/v1.18/flink/src/main/java/org/apache/iceberg/flink/FlinkSchemaUtil.java:
##
@@ -64,26 +68,75 @@ public static Schema convert(TableSchema schema) {
RowTy
HonahX merged PR #493:
URL: https://github.com/apache/iceberg-python/pull/493
HonahX commented on PR #493:
URL: https://github.com/apache/iceberg-python/pull/493#issuecomment-1975404888
Thanks for fixing the CI config!
kevinjqliu commented on PR #493:
URL: https://github.com/apache/iceberg-python/pull/493#issuecomment-1975396483
I just pushed the change, following the same format as
https://github.com/apache/iceberg-python/blob/main/.github/workflows/python-ci.yml#L25
This PR should now run the `chec
kevinjqliu commented on PR #493:
URL: https://github.com/apache/iceberg-python/pull/493#issuecomment-1975395425
Looks like it's related to this:
https://stackoverflow.com/questions/61989951/github-action-workflow-not-running
The CI and integration test GitHub Actions both specifically r
kevinjqliu commented on PR #493:
URL: https://github.com/apache/iceberg-python/pull/493#issuecomment-1975394112
In fact, I'm not sure the `Check Markdown links` GitHub Action has ever run for this repo:
https://github.com/apache/iceberg-python/actions/workflows/check-md-link.yml
-
kevinjqliu commented on PR #493:
URL: https://github.com/apache/iceberg-python/pull/493#issuecomment-1975393944
@HonahX that is weird. The github action is currently configured to run when
there are changes to `mkdocs/**`, which should definitely include this PR.
https://github.com/apach
HonahX commented on code in PR #473:
URL: https://github.com/apache/iceberg-python/pull/473#discussion_r1510415931
##
pyiceberg/serializers.py:
##
@@ -127,6 +127,6 @@ def table_metadata(metadata: TableMetadata, output_file:
OutputFile, overwrite:
overwrite (bool):
kevinjqliu commented on code in PR #493:
URL: https://github.com/apache/iceberg-python/pull/493#discussion_r1510411481
##
mkdocs/docs/SUMMARY.md:
##
@@ -26,6 +26,6 @@
- Releases
- [Verify a release](verify-release.md)
- [How to release](how-to-release.md)
-- [Code Ref
HonahX commented on code in PR #493:
URL: https://github.com/apache/iceberg-python/pull/493#discussion_r1510408594
##
mkdocs/docs/SUMMARY.md:
##
@@ -26,6 +26,6 @@
- Releases
- [Verify a release](verify-release.md)
- [How to release](how-to-release.md)
-- [Code Referen
HonahX closed issue #492: Integration test broken
URL: https://github.com/apache/iceberg-python/issues/492
HonahX merged PR #494:
URL: https://github.com/apache/iceberg-python/pull/494
kevinjqliu commented on issue #492:
URL: https://github.com/apache/iceberg-python/issues/492#issuecomment-1975369798
@HonahX opened #494
kevinjqliu opened a new pull request, #494:
URL: https://github.com/apache/iceberg-python/pull/494
Fixes #492
Thanks @HonahX for the provided solution :)
HonahX commented on issue #492:
URL: https://github.com/apache/iceberg-python/issues/492#issuecomment-1975369134
Could you please open one up? I can quickly merge that 😄 .
kevinjqliu commented on issue #492:
URL: https://github.com/apache/iceberg-python/issues/492#issuecomment-1975368576
Makes sense. I'm in PT (UTC-8), which matches the difference between `464603` and `464611`.
Just ran `make test-integration` with the
```
.config("spark.
kevinjqliu commented on issue #492:
URL: https://github.com/apache/iceberg-python/issues/492#issuecomment-1975368667
Do you already have a PR out for the change? Or should I push one up?
HonahX commented on issue #492:
URL: https://github.com/apache/iceberg-python/issues/492#issuecomment-1975346887
I also hit this today. I think this is because the Spark session uses our local timezone when inserting timestamp values into that table. We may configure the Spark session to alway
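A minimal sketch of the kind of Spark session change being discussed, assuming the fix is to pin the session timezone to UTC; the exact property added in the related conftest.py change is not shown in this digest.
```python
# Hedged sketch: pin the Spark session timezone to UTC so timestamp values
# written by the integration tests do not depend on the local timezone.
from pyspark.sql import SparkSession

spark = (
    SparkSession.builder.appName("PyIceberg integration test")
    # Assumption: forcing UTC here makes inserted timestamps reproducible
    # regardless of where the tests run.
    .config("spark.sql.session.timeZone", "UTC")
    .getOrCreate()
)
```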
HonahX commented on code in PR #363:
URL: https://github.com/apache/iceberg-python/pull/363#discussion_r1510403841
##
tests/conftest.py:
##
@@ -1961,6 +1961,7 @@ def spark() -> SparkSession:
spark = (
SparkSession.builder.appName("PyIceberg integration test")
+
rdblue closed issue #9712: Apache Project website check is failing
URL: https://github.com/apache/iceberg/issues/9712
rdblue merged PR #9729:
URL: https://github.com/apache/iceberg/pull/9729
rdblue merged PR #9854:
URL: https://github.com/apache/iceberg/pull/9854
bitsondatadev commented on PR #9729:
URL: https://github.com/apache/iceberg/pull/9729#issuecomment-1975307461
Huge shoutout to @munabedan for styling the footer, it looks great!
This one is ready for review. These changes should address all of the compliance issues our site currently has.
danielcweeks commented on PR #9685:
URL: https://github.com/apache/iceberg/pull/9685#issuecomment-1975284396
I don't think we can include this by default. The issue is that if a client is ever created without explicitly setting the HTTP implementation and there are two implementations on the class
ismailsimsek commented on PR #9685:
URL: https://github.com/apache/iceberg/pull/9685#issuecomment-1975274866
+1 on adding it. Without this dependency I get the following error:
`Caused by: java.lang.NoClassDefFoundError: com/amazonaws/event/ProgressListener`
himadripal commented on PR #486:
URL: https://github.com/apache/iceberg-python/pull/486#issuecomment-1975262865
Requesting a review on this one, please.
himadripal commented on PR #9839:
URL: https://github.com/apache/iceberg/pull/9839#issuecomment-1975260924
Can I get someone's attention for a review on this one please?
ismailsimsek commented on issue #6205:
URL: https://github.com/apache/iceberg/issues/6205#issuecomment-1975257517
Thank you. I ran into the same error and this solved it:
`Caused by: java.lang.ClassCastException: class
org.apache.parquet.schema.MessageType cannot be cast to class
org.a
rdblue commented on code in PR #9660:
URL: https://github.com/apache/iceberg/pull/9660#discussion_r1510344499
##
open-api/rest-catalog-open-api.yaml:
##
@@ -1581,6 +1608,17 @@ components:
type: string
example: [ "accounting", "tax" ]
+PageToken:
+ desc
rdblue commented on code in PR #9660:
URL: https://github.com/apache/iceberg/pull/9660#discussion_r1510344315
##
open-api/rest-catalog-open-api.yaml:
##
@@ -1581,6 +1608,17 @@ components:
type: string
example: [ "accounting", "tax" ]
+PageToken:
+ desc
fqaiser94 opened a new pull request, #9860:
URL: https://github.com/apache/iceberg/pull/9860
# What is the problem?
Currently the `table.newAppend()` API expects users to provide data files with the same `PartitionSpec` via `.appendFile()`.
Failing to do so
[raises](https://github.c
a-agmon commented on issue #6619:
URL: https://github.com/apache/iceberg/issues/6619#issuecomment-1975202872
@amogh-jahagirdar - I just wanted to say that we had a very problematic
recovery case today, and your comment saved the day. We simply changed the
metadata_location in glue and were
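A hedged sketch of the manual recovery described above, assuming the Iceberg table is registered in AWS Glue with its current metadata file recorded in the `metadata_location` table parameter; the database, table, and S3 path below are hypothetical.
```python
# Hedged sketch: point the Glue catalog entry back at a known-good Iceberg
# metadata file by rewriting the table's "metadata_location" parameter.
import boto3

glue = boto3.client("glue")

DATABASE = "analytics"   # hypothetical database name
TABLE = "events"         # hypothetical table name
GOOD_METADATA = "s3://bucket/warehouse/events/metadata/00042-known-good.metadata.json"

# Fetch the current table definition so only the one parameter changes.
current = glue.get_table(DatabaseName=DATABASE, Name=TABLE)["Table"]

table_input = {
    "Name": current["Name"],
    "TableType": current.get("TableType", "EXTERNAL_TABLE"),
    "StorageDescriptor": current.get("StorageDescriptor", {}),
    "Parameters": {
        **current.get("Parameters", {}),
        "metadata_location": GOOD_METADATA,
    },
}

glue.update_table(DatabaseName=DATABASE, TableInput=table_input)
```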