varpa89 opened a new issue, #12466: URL: https://github.com/apache/iceberg/issues/12466
### Apache Iceberg version

1.7.1

### Query engine

Spark

### Please describe the bug 🐞

When we use `SparkCatalog` with the Iceberg REST catalog and try to select from the metadata tables described in the [documentation](https://iceberg.apache.org/docs/latest/spark-queries/), the communication between Spark and the catalog looks wrong:

```
SparkSession spark = SparkSession.builder()
    .appName("Demo")
    .config("spark.sql.catalog.kometa", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.kometa.type", "rest")
    .config("spark.sql.catalog.kometa.uri", "http://localhost:8082/iceberg/default") // Our REST catalog
    .config("spark.sql.catalog.kometa.warehouse", "s3a://kometa/default")
    .config("spark.sql.catalog.kometa.s3.endpoint", "http://localhost:9000") // MinIO
    .config("spark.sql.catalog.kometa.prefix", "main")
    .config("spark.sql.catalog.kometa.s3.path-style-access", "true")
    .config("spark.sql.catalog.kometa.cache-enabled", "false") // Disabling the cache. Important
    .config("spark.sql.extensions", "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
    .master("local[*]")
    .getOrCreate();

spark.sql("CREATE NAMESPACE kometa.data");
spark.sql("CREATE TABLE kometa.data.info (id INT, name STRING) USING iceberg PARTITIONED BY (id)");

spark.sql("SELECT * FROM kometa.data.info.snapshots").collectAsList();  // Does not work
spark.sql("SELECT * FROM kometa.data.info.partitions").collectAsList(); // Does not work
```

The queries go to `GET /iceberg/default/v1/main/namespaces/data.info/tables/snapshots` and `GET /iceberg/default/v1/main/namespaces/data.info/tables/partitions`, and we get a 404 because the catalog has no namespace `data.info` (`data` is the namespace and `info` is the table, but both are passed as part of the namespace). If I understand the [contract](https://editor-next.swagger.io/?url=https://raw.githubusercontent.com/apache/iceberg/main/open-api/rest-catalog-open-api.yaml) properly, `snapshots`/`partitions` should not be part of the request at all.

An important detail is that `spark.sql.catalog.kometa.cache-enabled` must be set to `false`; otherwise the table is cached and the catalog is never called at all.

Spark version:

```
testImplementation("org.apache.spark:spark-sql_2.13:3.5.4")
testImplementation("org.apache.iceberg:iceberg-spark-runtime-3.5_2.13:1.7.1")
```

### Willingness to contribute

- [ ] I can contribute a fix for this bug independently
- [ ] I would be willing to contribute a fix for this bug with guidance from the Iceberg community
- [ ] I cannot contribute a fix for this bug at this time
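For reference, below is a minimal sketch of how I would expect the resolution to work when talking to the REST catalog directly through the Iceberg Java API, based on my reading of the spec linked above: one load of the base table `data.info`, with the snapshots/partitions views derived client-side from the returned metadata. The catalog properties mirror the Spark config above; the class name `MetadataTableCheck` and the exact FileIO/credential setup are assumptions for illustration only, not the actual fix.

```
import java.util.Map;

import org.apache.iceberg.MetadataTableType;
import org.apache.iceberg.MetadataTableUtils;
import org.apache.iceberg.Table;
import org.apache.iceberg.catalog.TableIdentifier;
import org.apache.iceberg.rest.RESTCatalog;

public class MetadataTableCheck {
  public static void main(String[] args) throws Exception {
    // Properties mirror the Spark catalog config above (local REST catalog + MinIO).
    RESTCatalog catalog = new RESTCatalog();
    catalog.initialize(
        "kometa",
        Map.of(
            "uri", "http://localhost:8082/iceberg/default",
            "warehouse", "s3a://kometa/default",
            "prefix", "main",
            "s3.endpoint", "http://localhost:9000",
            "s3.path-style-access", "true"));

    // Expected: a single GET .../namespaces/data/tables/info to load the base table.
    Table table = catalog.loadTable(TableIdentifier.of("data", "info"));

    // The snapshots/partitions metadata tables are then built client-side from the
    // metadata returned by that load; no extra catalog request should be needed.
    Table snapshotsTable =
        MetadataTableUtils.createMetadataTableInstance(table, MetadataTableType.SNAPSHOTS);
    System.out.println(snapshotsTable.schema());

    // Snapshot history is also directly available from the loaded table.
    table.snapshots().forEach(snapshot -> System.out.println(snapshot.snapshotId()));

    catalog.close();
  }
}
```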