ideal opened a new issue, #10772:
URL: https://github.com/apache/iceberg/issues/10772

   ### Apache Iceberg version
   
   1.2.1
   
   ### Query engine
   
   Flink
   
   ### Please describe the bug 🐞
   
   Flink version: 1.14.5
   Hive version: 3.1.3
   Java version: 1.8.0_411
   
   Running: `bin/sql-client.sh -j lib/iceberg-flink-runtime-1.14-1.2.1.jar -j lib/hive-metastore-2.3.8.jar -j lib/libthrift-0.9.3.jar`
   
   ```
   CREATE CATALOG icebergtest WITH (
       'type'='iceberg',
       'catalog-type'='hive',
       'catalog-database'='default',
       'uri'='thrift://localhost:9083',
       'warehouse'='s3a://bucketname/warehouse'
   );
   ```
   
   Got error:
   
   ```
    Exception in thread "main" org.apache.flink.table.client.SqlClientException: Unexpected exception. This is a bug. Please consider filing an issue.
            at org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:201)
            at org.apache.flink.table.client.SqlClient.main(SqlClient.java:161)
    Caused by: java.lang.VerifyError: Stack map does not match the one at exception handler 20
    Exception Details:
      Location:
        org/apache/iceberg/hive/HiveCatalog.alterHiveDataBase(Lorg/apache/iceberg/catalog/Namespace;Lorg/apache/hadoop/hive/metastore/api/Database;)V @20: astore_3
      Reason:
        Type 'org/apache/hadoop/hive/metastore/api/NoSuchObjectException' (current frame, stack[0]) is not assignable to 'org/apache/thrift/TException' (stack map, stack[0])
      Current Frame:
        bci: @0
        flags: { }
        locals: { 'org/apache/iceberg/hive/HiveCatalog', 'org/apache/iceberg/catalog/Namespace', 'org/apache/hadoop/hive/metastore/api/Database' }
        stack: { 'org/apache/hadoop/hive/metastore/api/NoSuchObjectException' }
      Stackmap Frame:
        bci: @20
        flags: { }
        locals: { 'org/apache/iceberg/hive/HiveCatalog', 'org/apache/iceberg/catalog/Namespace', 'org/apache/hadoop/hive/metastore/api/Database' }
        stack: { 'org/apache/thrift/TException' }
      Bytecode:
        0x0000000: 2ab4 00a6 2b2c ba02 9600 00b9 00da 0200
        0x0000010: 57a7 0066 4ebb 0122 592d 1301 2404 bd01
        0x0000020: 2659 032b 53b7 0297 bf4e bb01 2b59 bb01
        0x0000030: 2d59 b701 2e13 0299 b601 342b b601 3713
        0x0000040: 0210 b601 34b6 013b 2db7 013e bf4e b801
        0x0000050: 44b6 0147 bb01 2b59 bb01 2d59 b701 2e13
        0x0000060: 029b b601 342b b601 3713 0210 b601 34b6
        0x0000070: 013b 2db7 013e bfb1
      Exception Handler Table:
        bci [0, 17] => handler: 20
        bci [0, 17] => handler: 20
        bci [0, 17] => handler: 41
        bci [0, 17] => handler: 77
      Stackmap Table:
        same_locals_1_stack_item_frame(@20,Object[#176])
        same_locals_1_stack_item_frame(@41,Object[#176])
        same_locals_1_stack_item_frame(@77,Object[#178])
        same_frame(@119)

            at java.lang.Class.forName0(Native Method)
            at java.lang.Class.forName(Class.java:348)
            at org.apache.iceberg.common.DynConstructors$Builder.impl(DynConstructors.java:149)
            at org.apache.iceberg.CatalogUtil.loadCatalog(CatalogUtil.java:221)
            at org.apache.iceberg.flink.CatalogLoader$HiveCatalogLoader.loadCatalog(CatalogLoader.java:118)
            at org.apache.iceberg.flink.FlinkCatalog.<init>(FlinkCatalog.java:114)
            at org.apache.iceberg.flink.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:167)
            at org.apache.iceberg.flink.FlinkCatalogFactory.createCatalog(FlinkCatalogFactory.java:140)
            at org.apache.flink.table.factories.FactoryUtil.createCatalog(FactoryUtil.java:267)
            at org.apache.flink.table.api.internal.TableEnvironmentImpl.createCatalog(TableEnvironmentImpl.java:1292)
            at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1122)
            at org.apache.flink.table.client.gateway.local.LocalExecutor.lambda$executeOperation$3(LocalExecutor.java:209)
            at org.apache.flink.table.client.gateway.context.ExecutionContext.wrapClassLoader(ExecutionContext.java:88)
            at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:209)
            at org.apache.flink.table.client.cli.CliClient.executeOperation(CliClient.java:625)
            at org.apache.flink.table.client.cli.CliClient.callOperation(CliClient.java:447)
            at org.apache.flink.table.client.cli.CliClient.lambda$executeStatement$1(CliClient.java:332)
            at java.util.Optional.ifPresent(Optional.java:159)
            at org.apache.flink.table.client.cli.CliClient.executeStatement(CliClient.java:325)
            at org.apache.flink.table.client.cli.CliClient.executeInteractive(CliClient.java:297)
            at org.apache.flink.table.client.cli.CliClient.executeInInteractiveMode(CliClient.java:221)
            at org.apache.flink.table.client.SqlClient.openCli(SqlClient.java:151)
            at org.apache.flink.table.client.SqlClient.start(SqlClient.java:95)
            at org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:187)
            ... 1 more
   
   Shutting down the session...
   done.
   ```
   
   
   
   ### Willingness to contribute
   
   - [ ] I can contribute a fix for this bug independently
   - [X] I would be willing to contribute a fix for this bug with guidance from 
the Iceberg community
   - [ ] I cannot contribute a fix for this bug at this time

