RussellSpitzer commented on code in PR #6886:
URL: https://github.com/apache/iceberg/pull/6886#discussion_r1174057983


##########
spark/v3.3/spark/src/test/java/org/apache/iceberg/spark/sql/TestCreateTable.java:
##########
@@ -341,4 +343,42 @@ public void testDowngradeTableToFormatV1ThroughTablePropertyFails() {
        "Cannot downgrade v2 table to v1",
        () -> sql("ALTER TABLE %s SET TBLPROPERTIES ('format-version'='1')", tableName));
  }
+
+  @Test
+  public void testCatalogSpecificWarehouseLocation() throws Exception {
+    File warehouseLocation = temp.newFolder(catalogName);
+    Assert.assertTrue(warehouseLocation.delete());
+    String location = "file:" + warehouseLocation.getAbsolutePath();
+    spark.conf().set("spark.sql.catalog." + catalogName + ".warehouse", location);
+    String testNameSpace =
+        (catalogName.equals("spark_catalog") ? "" : catalogName + ".") + "default1";
+    String testTableName = testNameSpace + ".table";
+    TableIdentifier newTableIdentifier = TableIdentifier.of("default1", "table");
+
+    sql("CREATE NAMESPACE IF NOT EXISTS %s", testNameSpace);
+    sql("CREATE TABLE %s " + "(id BIGINT NOT NULL, data STRING) " + "USING iceberg", testTableName);
+
+    Table table = Spark3Util.loadIcebergCatalog(spark, catalogName).loadTable(newTableIdentifier);
+    if ("spark_catalog".equals(catalogName)) {
+      Assert.assertTrue(table.location().startsWith(spark.sqlContext().conf().warehousePath()));
+    } else {
+      Assert.assertTrue(table.location().startsWith(location));
+    }
+    sql("DROP TABLE IF EXISTS %s", testTableName);
+    sql("DROP NAMESPACE IF EXISTS %s", testNameSpace);
+  }
+
+  @Test
+  public void testCatalogWithSparkSqlWarehouseDir() {
+    spark.conf().unset("spark.sql.catalog." + catalogName + ".warehouse");

Review Comment:
   Looks like you figured it out. As I was mentioning, what you want to do in this case is make a brand-new session (not a new context). You can have multiple sessions per Spark context, each with its own runtime configuration. So when a test needs a differently configured session, you can just call "newSession" at the beginning and use that new session throughout the test, as in the sketch below.
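
   For illustration, a minimal sketch of that approach, assuming the suite's shared "spark" field and the "catalogName"/"testNameSpace" variables from the test above; the variable name "isolatedSpark" is hypothetical:

       import org.apache.spark.sql.SparkSession;

       // Fork an isolated session off the shared SparkContext; SQL conf
       // changes made on it do not leak back into the parent session or
       // into other tests running against the shared one.
       SparkSession isolatedSpark = spark.newSession();

       // Reconfigure only the new session, e.g. drop the per-catalog
       // warehouse override so the catalog falls back to
       // spark.sql.warehouse.dir (mirrors the unset() in this test).
       isolatedSpark.conf().unset("spark.sql.catalog." + catalogName + ".warehouse");

       // Run the test's SQL through the isolated session rather than
       // the shared one used by the rest of the suite.
       isolatedSpark.sql(String.format("CREATE NAMESPACE IF NOT EXISTS %s", testNameSpace));

   Because the forked session shares the underlying SparkContext, this is cheap and avoids tearing down and rebuilding the shared session between tests.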




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

