manuzhang commented on issue #13330:
URL: https://github.com/apache/iceberg/issues/13330#issuecomment-2994871619

   The most relevant case I've seen is HadoopCatalog, which relies on 
`metadata.json` files to load the table. Under high-frequency, highly concurrent 
updates, the `metadata.json` you just resolved may already have been deleted by 
the time you read it if `write.metadata.previous-versions-max` is set to 10. For 
example, when `partial-progress.enabled` is set for `rewrite_data_files`, the 
default maximum number of commits (`partial-progress.max-commits`) is 10, so it's 
easy to hit this failure.
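   To make the scenario concrete, here is a minimal sketch (catalog path and table name are hypothetical) of the table configuration that makes this race likely: a HadoopCatalog table that keeps only 10 previous `metadata.json` files and deletes older ones eagerly on commit, while concurrent writers (e.g. a partial-progress `rewrite_data_files` issuing up to 10 commits) advance the metadata quickly.

   ```java
   import org.apache.hadoop.conf.Configuration;
   import org.apache.iceberg.Table;
   import org.apache.iceberg.catalog.TableIdentifier;
   import org.apache.iceberg.hadoop.HadoopCatalog;

   public class MetadataRetentionSketch {
     public static void main(String[] args) {
       // Hypothetical warehouse location and table identifier.
       HadoopCatalog catalog = new HadoopCatalog(new Configuration(), "hdfs://warehouse/path");
       Table table = catalog.loadTable(TableIdentifier.of("db", "events"));

       // With these properties, each commit may delete the oldest tracked metadata.json.
       // A reader that resolved an older metadata.json via HadoopCatalog can then find
       // the file already gone when many commits land in quick succession.
       table.updateProperties()
           .set("write.metadata.previous-versions-max", "10")
           .set("write.metadata.delete-after-commit.enabled", "true")
           .commit();
     }
   }
   ```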

