dor-bernstein opened a new issue, #1946: URL: https://github.com/apache/iceberg-python/issues/1946
### Apache Iceberg version

None

### Please describe the bug 🐞

Hey, I have a large Arrow table that I want to append to a partitioned Iceberg table. I'm working locally with Docker, using `tabulario/iceberg-rest:1.6.0` as my REST catalog. To avoid OOMs, I split the Arrow table into chunks. With regular appends everything works as expected; however, I want to append all the data in a single transaction. This is the code that does that:

```python
with table.transaction() as tx:
    for offset in range(0, data.num_rows, MAX_APPEND_CHUNK_SIZE):
        data_slice = data.slice(offset, MAX_APPEND_CHUNK_SIZE)
        logger.info(f'Writing batch of {data_slice.num_rows} with offset {offset} to table {table.name()}')
        tx.append(data_slice)
    tx.commit_transaction()
```

The table is empty and was created in a different task. I get the following error:

```
CommitFailedException: Requirement failed: branch main was created concurrently
```

When retrying I get this error:

```
pyiceberg.exceptions.CommitFailedException: CommitFailedException: Requirement failed: branch main has changed: expected id 4547037169132709864 != 132570956257248456
```

Any help would be appreciated, thanks!

### Willingness to contribute

- [ ] I can contribute a fix for this bug independently
- [ ] I would be willing to contribute a fix for this bug with guidance from the Iceberg community
- [x] I cannot contribute a fix for this bug at this time