ajantha-bhat commented on code in PR #14868:
URL: https://github.com/apache/iceberg/pull/14868#discussion_r2643779578


##########
core/src/test/java/org/apache/iceberg/view/ViewCatalogTests.java:
##########
@@ -1968,4 +1970,87 @@ public void dropNonEmptyNamespace() {
         .as("Namespace should not exist")
         .isFalse();
   }
+
+  @Test
+  public void registerView() {
+    C catalog = catalog();
+
+    // Register view is not yet supported for REST catalog
+    assumeThat(catalog instanceof RESTCatalog).isFalse();
+
+    TableIdentifier identifier = TableIdentifier.of("ns", "view");
+
+    if (requiresNamespaceCreate()) {
+      catalog.createNamespace(identifier.namespace());
+    }
+
+    View originalView =
+        catalog()
+            .buildView(identifier)
+            .withSchema(SCHEMA)
+            .withDefaultNamespace(identifier.namespace())
+            .withDefaultCatalog(catalog().name())
+            .withQuery("spark", "select * from ns.tbl")
+            .withProperty(GC_ENABLED, "false")
+            .create();
+
+    ViewOperations ops = ((BaseView) originalView).operations();
+    String metadataLocation = ops.current().metadataFileLocation();
+
+    assertThat(catalog.dropView(identifier)).isTrue();
+    assertThat(catalog.viewExists(identifier)).as("View must not exist").isFalse();
+
+    // view metadata should still exist after dropping the view as gc is disabled
+    assertThat(((BaseViewOperations) ops).io().newInputFile(metadataLocation).exists()).isTrue();
+
+    View registeredView = catalog.registerView(identifier, metadataLocation);
+
+    assertThat(registeredView).isNotNull();
+    assertThat(catalog.viewExists(identifier)).as("View must exist").isTrue();
+    assertThat(registeredView.schema().asStruct())
+        .as("Schema must match")
+        .isEqualTo(originalView.schema().asStruct());
+    assertThat(registeredView.currentVersion())
+        .as("Current version must match")
+        .isEqualTo(originalView.currentVersion());
+    assertThat(registeredView.versions())

Review Comment:
   I have added a test in `TestViews` that uses a view with an invalid catalog name.
   
   ```
   @TestTemplate
   public void readFromViewWithInvalidCatalogName() throws NoSuchTableException {
     insertRows(10);
     String viewName = viewName("simpleView");
     String sql = String.format("SELECT id FROM %s", tableName);

     ViewCatalog viewCatalog = viewCatalog();

     viewCatalog
         .buildView(TableIdentifier.of(NAMESPACE, viewName))
         .withQuery("spark", sql)
         // use non-existing column name to make sure only the SQL definition for spark is loaded
         .withQuery("trino", String.format("SELECT non_existing FROM %s", tableName))
         .withDefaultNamespace(NAMESPACE)
         .withDefaultCatalog("invalid_catalog")
         .withSchema(schema(sql))
         .create();

     sql("SELECT * FROM %s", viewName);
   }
   ```
    
   Querying the view via `sql("SELECT * FROM %s", viewName)` fails with the error below:
    
   ```
   org.apache.spark.sql.AnalysisException: [TABLE_OR_VIEW_NOT_FOUND] The table or view `invalid_catalog`.`default`.`table` cannot be found. Verify the spelling and correctness of the schema and catalog.
   If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
   To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 1 pos 15;
   'Project [*]
   +- 'SubqueryAlias simpleView969944
      +- 'Project [upcast(getcolumnbyordinal(0, IntegerType), IntegerType) AS id#16]
         +- 'Project ['id]
            +- 'UnresolvedRelation [invalid_catalog, default, table], [], false

        at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.tableNotFound(package.scala:91)
        at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis0$2(CheckAnalysis.scala:306)
        at org.apache.spark.sql.catalyst.analysis.CheckAnalysis.$anonfun$checkAnalysis0$2$adapted(CheckAnalysis.scala:284)
   ```
   
   So, Spark cannot query a view registered from another catalog, and that is a major interop concern (I vaguely remember bringing this up while standardizing the view spec).
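
   To make the mechanism concrete, here is a minimal sketch (not part of this PR's test code) of what carries over on registration, using the `registerView` API added in this PR; the target catalog, identifier, and metadata location are placeholders:

   ```
   import org.apache.iceberg.catalog.TableIdentifier;
   import org.apache.iceberg.view.View;
   import org.apache.iceberg.view.ViewCatalog;

   class RegisterViewCatalogCheck {
     // Sketch only: registerView re-uses the existing metadata file, so the current
     // version still records the default catalog name the view was created with.
     static String carriedOverDefaultCatalog(
         ViewCatalog targetCatalog, TableIdentifier identifier, String metadataLocation) {
       View registered = targetCatalog.registerView(identifier, metadataLocation);

       // Spark resolves the view SQL against this name; if the Spark session has no
       // catalog configured under it, analysis fails with TABLE_OR_VIEW_NOT_FOUND as above.
       return registered.currentVersion().defaultCatalog();
     }
   }
   ```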
   
   I don't think the register function should alter the catalog name, as that would lead to a new `versionUpdate` with a new version ID and create a new metadata file, which is not the purpose of the register function. So, Spark users may not be able to use that view easily (without using some APIs to alter it, as sketched below).
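
   For reference, a rough sketch of the kind of manual fix-up that would be needed with the existing view API (the catalog name `spark_catalog`, the namespace, and the SQL are illustrative):

   ```
   import org.apache.iceberg.catalog.Namespace;
   import org.apache.iceberg.catalog.TableIdentifier;
   import org.apache.iceberg.view.View;
   import org.apache.iceberg.view.ViewCatalog;

   class RepointRegisteredView {
     // Sketch only: commit a replacement version whose default catalog points at the
     // catalog name the Spark session actually has configured.
     static void repointDefaultCatalog(ViewCatalog catalog, TableIdentifier identifier) {
       View view = catalog.loadView(identifier);
       view.replaceVersion()
           .withSchema(view.schema())
           .withQuery("spark", "select * from ns.tbl") // re-use the original view SQL
           .withDefaultNamespace(Namespace.of("ns"))
           .withDefaultCatalog("spark_catalog") // catalog name known to the Spark session
           .commit();
     }
   }
   ```

   That works, but it pushes exactly the extra version and metadata churn described above onto the user.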
   
   I think we need a broader discussion on this, and @nastra is on holiday. We should get back to it once he returns.


