FrommyMind opened a new issue, #25099:
URL: https://github.com/apache/doris/issues/25099

   ### Search before asking
   
   - [X] I had searched in the [issues](https://github.com/apache/doris/issues?q=is%3Aissue) and found no similar issues.
   
   
   ### Version
   
   Doris Version: doris-2.0.2-rc05-ae923f7
   Java Version: 1.8.0_261
   OS Version: CentOS Linux release 7.3.1611 (Core)  3.10.0-514.el7.x86_64
   Hive Version: 2.1.1+cdh6.3.2
   
   ### What's Wrong?
   
   The Hive external table can be created, but it cannot be queried.
   
   Create the Hive external table with the DDL below:
   ```sql
   create EXTERNAL table  date_dim (
     `d_date_sk` int, 
     `d_date_id` varchar(16), 
     `d_date` varchar(10), 
     `d_month_seq` int, 
     `d_week_seq` int, 
     `d_quarter_seq` int, 
     `d_year` int, 
     `d_dow` int, 
     `d_moy` int, 
     `d_dom` int, 
     `d_qoy` int, 
     `d_fy_year` int, 
     `d_fy_quarter_seq` int, 
     `d_fy_week_seq` int, 
     `d_day_name` varchar(9), 
     `d_quarter_name` varchar(6), 
     `d_holiday` varchar(1), 
     `d_weekend` varchar(1), 
     `d_following_holiday` varchar(1), 
     `d_first_dom` int, 
     `d_last_dom` int, 
     `d_same_day_ly` int, 
     `d_same_day_lq` int, 
     `d_current_day` varchar(1), 
     `d_current_week` varchar(1), 
     `d_current_month` varchar(1), 
     `d_current_quarter` varchar(1), 
     `d_current_year` varchar(1))
     ENGINE=hive
   PROPERTIES (
       "database" = "tpcds_10000_parquet",
       "table" = "date_dim",
       "hive.metastore.uris" = "thrift://xx.xx.xx.xx:9083",
        'hadoop.username' = 'user',
       'dfs.nameservices'='<nameservice>',
       'dfs.ha.namenodes.<nameservice>'='<namenode1>,<namenode2>',
       'dfs.namenode.rpc-address.<nameservice>.<namenode1>'='xxx.xxx.xxx.xxx:8022',
       'dfs.namenode.rpc-address.<nameservice>.<namenode2>'='xxx.xxx.xxx.xxx:8020',
       'dfs.client.failover.proxy.provider.nameservice1'='org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider',
        'hive.version' = '2.1.1'
   );
   ```
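
   Possibly related context, offered as an assumption rather than a confirmed diagnosis: Doris 2.x documents the Multi-Catalog feature (`"type" = "hms"`) as the recommended way to read Hive tables, with `ENGINE=hive` external tables being the legacy path. A minimal sketch of the catalog-based equivalent, reusing the redacted placeholders from the DDL above (the catalog name `hive_cdh` is made up for illustration):
   ```sql
   -- Sketch only: Hive access via a Doris 2.x catalog instead of the legacy ENGINE=hive table.
   -- All <placeholders> and addresses are the redacted values from the DDL above.
   CREATE CATALOG hive_cdh PROPERTIES (
       "type" = "hms",
       "hive.metastore.uris" = "thrift://xx.xx.xx.xx:9083",
       "hadoop.username" = "user",
       "dfs.nameservices" = "<nameservice>",
       "dfs.ha.namenodes.<nameservice>" = "<namenode1>,<namenode2>",
       "dfs.namenode.rpc-address.<nameservice>.<namenode1>" = "xxx.xxx.xxx.xxx:8022",
       "dfs.namenode.rpc-address.<nameservice>.<namenode2>" = "xxx.xxx.xxx.xxx:8020",
       "dfs.client.failover.proxy.provider.<nameservice>" = "org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider"
   );
   -- The table would then be addressed as:
   -- SELECT count(1) FROM hive_cdh.tpcds_10000_parquet.date_dim;
   ```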
   
   Table status:
   ```
   MySQL [tpch_test]> show table status from tpch_test like '%date_dim'\G;
   *************************** 1. row ***************************
              Name: date_dim
            Engine: Hive
           Version: NULL
        Row_format: NULL
              Rows: 1
    Avg_row_length: 0
       Data_length: 0
   Max_data_length: NULL
      Index_length: NULL
         Data_free: NULL
    Auto_increment: NULL
       Create_time: 2023-10-07 19:25:23
       Update_time: NULL
        Check_time: NULL
         Collation: utf-8
          Checksum: NULL
    Create_options: NULL
           Comment: HIVE
   1 row in set (0.00 sec)
   
   ERROR: No query specified
   
   MySQL [tpch_test]> 
   ```
   
   Show create table output:
   ```sql
   MySQL [tpch_test]> show create table date_dim;
   +----------+--------------+
   | Table    | Create Table |
   +----------+--------------+
   | date_dim | CREATE EXTERNAL TABLE `date_dim` (
     `d_date_sk` int(11) NULL,
     `d_date_id` varchar(16) NULL,
     `d_date` varchar(10) NULL,
     `d_month_seq` int(11) NULL,
     `d_week_seq` int(11) NULL,
     `d_quarter_seq` int(11) NULL,
     `d_year` int(11) NULL,
     `d_dow` int(11) NULL,
     `d_moy` int(11) NULL,
     `d_dom` int(11) NULL,
     `d_qoy` int(11) NULL,
     `d_fy_year` int(11) NULL,
     `d_fy_quarter_seq` int(11) NULL,
     `d_fy_week_seq` int(11) NULL,
     `d_day_name` varchar(9) NULL,
     `d_quarter_name` varchar(6) NULL,
     `d_holiday` varchar(1) NULL,
     `d_weekend` varchar(1) NULL,
     `d_following_holiday` varchar(1) NULL,
     `d_first_dom` int(11) NULL,
     `d_last_dom` int(11) NULL,
     `d_same_day_ly` int(11) NULL,
     `d_same_day_lq` int(11) NULL,
     `d_current_day` varchar(1) NULL,
     `d_current_week` varchar(1) NULL,
     `d_current_month` varchar(1) NULL,
     `d_current_quarter` varchar(1) NULL,
     `d_current_year` varchar(1) NULL
   ) ENGINE=HIVE
   COMMENT 'HIVE'
   PROPERTIES (
   "database" = "tpcds_10000_parquet",
   "table" = "date_dim",
   "dfs.ha.namenodes.<nameservice>"  =  "<namenode1>,<namenode2>",
   "dfs.namenode.rpc-address.<nameservice>.<namenode1>"  =  
"xxx.xxx.xxx.xx:8022",
   "hadoop.security.authentication"  =  "simple",
   "hive.version"  =  "2.1.1",
   "hive.metastore.uris"  =  "thrift://xxx.xxx.xxx.xxx:9083",
   "dfs.nameservices"  =  "<nameservice>",
   "dfs.client.failover.proxy.provider.<nameservice>"  =  
"org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider",
   "hadoop.username"  =  "usernname",
   "dfs.namenode.rpc-address.<nameservice>.<namenode1>"  =  
"xxx.xxx.xxx.xxx:8020"
   ); |
   +----------+--------------+
   1 row in set (0.00 sec)
   ```
   
   
   But when I try to query it:
   ```sql
   MySQL [tpch_test]> select count(1) from date_dim;
   ERROR 1105 (HY000): errCode = 2, detailMessage = Unexpected exception: 
Unsupported tableType:HIVE
   MySQL [tpch_test]> 
   ```
   
   Error log from FE:
   ```java
   2023-10-07 19:25:38,975 WARN (mysql-nio-pool-7|913) [StmtExecutor.execute():442] Analyze failed. stmt[64, c9ec6b94d404b48-b6dd89f19b946e6a]
   org.apache.doris.common.NereidsException: errCode = 2, detailMessage = Unexpected exception: Unsupported tableType:HIVE
           at org.apache.doris.qe.StmtExecutor.executeByNereids(StmtExecutor.java:558) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.qe.StmtExecutor.execute(StmtExecutor.java:432) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.qe.StmtExecutor.execute(StmtExecutor.java:422) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.qe.ConnectProcessor.handleQuery(ConnectProcessor.java:435) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.qe.ConnectProcessor.dispatch(ConnectProcessor.java:583) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.qe.ConnectProcessor.processOnce(ConnectProcessor.java:834) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.mysql.ReadListener.lambda$handleEvent$0(ReadListener.java:52) ~[doris-fe.jar:1.2-SNAPSHOT]
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_261]
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_261]
           at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_261]
   2023-10-07 19:25:38,975 WARN (mysql-nio-pool-7|913) [ConnectProcessor.handleQueryException():471] Process one query failed because.
   org.apache.doris.common.AnalysisException: errCode = 2, detailMessage = Unexpected exception: Unsupported tableType:HIVE
           at org.apache.doris.qe.StmtExecutor.executeByNereids(StmtExecutor.java:558) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.qe.StmtExecutor.execute(StmtExecutor.java:432) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.qe.StmtExecutor.execute(StmtExecutor.java:422) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.qe.ConnectProcessor.handleQuery(ConnectProcessor.java:435) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.qe.ConnectProcessor.dispatch(ConnectProcessor.java:583) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.qe.ConnectProcessor.processOnce(ConnectProcessor.java:834) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.mysql.ReadListener.lambda$handleEvent$0(ReadListener.java:52) ~[doris-fe.jar:1.2-SNAPSHOT]
           at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_261]
           at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_261]
           at java.lang.Thread.run(Thread.java:748) ~[?:1.8.0_261]
   Caused by: org.apache.doris.nereids.exceptions.AnalysisException: Unsupported tableType:HIVE
           at org.apache.doris.nereids.rules.analysis.BindRelation.getLogicalPlan(BindRelation.java:230) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.nereids.rules.analysis.BindRelation.bindWithCurrentDb(BindRelation.java:150) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.nereids.rules.analysis.BindRelation.doBindRelation(BindRelation.java:106) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.nereids.rules.analysis.BindRelation.lambda$build$0(BindRelation.java:90) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.nereids.pattern.PatternMatcher$1.transform(PatternMatcher.java:92) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.nereids.jobs.rewrite.PlanTreeRewriteJob.rewrite(PlanTreeRewriteJob.java:49) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.nereids.jobs.rewrite.PlanTreeRewriteBottomUpJob.rewriteThis(PlanTreeRewriteBottomUpJob.java:116) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.nereids.jobs.rewrite.PlanTreeRewriteBottomUpJob.execute(PlanTreeRewriteBottomUpJob.java:80) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.nereids.jobs.scheduler.SimpleJobScheduler.executeJobPool(SimpleJobScheduler.java:39) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.nereids.jobs.rewrite.RootPlanTreeRewriteJob.execute(RootPlanTreeRewriteJob.java:52) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.nereids.jobs.executor.AbstractBatchJobExecutor.execute(AbstractBatchJobExecutor.java:119) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.nereids.jobs.executor.Analyzer.analyze(Analyzer.java:79) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.nereids.NereidsPlanner.analyze(NereidsPlanner.java:283) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.nereids.NereidsPlanner.plan(NereidsPlanner.java:198) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.nereids.NereidsPlanner.plan(NereidsPlanner.java:125) ~[doris-fe.jar:1.2-SNAPSHOT]
           at org.apache.doris.qe.StmtExecutor.executeByNereids(StmtExecutor.java:554) ~[doris-fe.jar:1.2-SNAPSHOT]
           ... 9 more
   ```
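
   Judging from the stack trace, the failure happens while the Nereids planner binds the relation (`BindRelation.getLogicalPlan`), which apparently does not handle the legacy HIVE table type. As an untested assumption on my side (not a confirmed fix), falling back to the original planner for the session might sidestep the analysis error:
   ```sql
   -- Untested workaround sketch: these session variables exist in Doris 2.0,
   -- but I have not verified that they help on doris-2.0.2-rc05.
   SET enable_nereids_planner = false;
   SET enable_fallback_to_original_planner = true;
   SELECT count(1) FROM date_dim;
   ```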
   
   
   ### What You Expected?
   
   The Hive external table can be queried normally.
   
   ### How to Reproduce?
   
   1. Create the Hive external table with the DDL above.
   2. Try to query it (see the condensed sketch below).
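
   A condensed version of these two steps, with all placeholders as in the full DDL above:
   ```sql
   -- Condensed repro sketch; column list and HDFS HA properties abbreviated, see the full DDL above.
   create EXTERNAL table date_dim (`d_date_sk` int /* ... remaining columns ... */)
   ENGINE=hive
   PROPERTIES (
       "database" = "tpcds_10000_parquet",
       "table" = "date_dim",
       "hive.metastore.uris" = "thrift://xx.xx.xx.xx:9083"
       -- plus hadoop.username and the dfs.* HA properties shown above
   );
   select count(1) from date_dim;  -- fails with "Unsupported tableType:HIVE"
   ```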
   
   ### Anything Else?
   
   _No response_
   
   ### Are you willing to submit PR?
   
   - [ ] Yes I am willing to submit a PR!
   
   ### Code of Conduct
   
   - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct)
   

