HonestManXin opened a new pull request, #21881:
URL: https://github.com/apache/doris/pull/21881

   
   ## Proposed changes
   
   
   When a Hive external table is created for Spark Load, the table already carries the Hive connection information, such as the Hive Metastore URI. Even so, submitting the load job currently requires a hive-site.xml file in the Spark conf directory; without it, the Spark job may fail with an error indicating that the corresponding Hive table cannot be found.
   
   The SparkEtlJob.initSparkConfigs method copies the external table's properties into the Spark conf, but by that point the Spark session has already been created, so the Hive-related parameters never take effect. For the Spark Hive catalog to load Hive tables correctly, the Hive-related parameters must be set before the Spark session is created, as sketched below.
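   
   A minimal sketch of the intended ordering, using the standard Spark Java API. The property key `hive.metastore.uris` is the stock Hive setting; the class name, metastore URI, and query here are illustrative placeholders, not the PR's actual diff:
   
   ```java
   import org.apache.spark.SparkConf;
   import org.apache.spark.sql.SparkSession;
   
   public class HiveCatalogOrderingSketch {
       public static void main(String[] args) {
           // Hive connection info taken from the external table's properties.
           // The URI below is a placeholder for the value stored on the table.
           SparkConf conf = new SparkConf()
                   .set("hive.metastore.uris", "thrift://hive-metastore-host:9083");
   
           // The Hive parameters must be on the conf BEFORE the session is
           // built. Setting them afterwards (the old initSparkConfigs timing)
           // does not reconfigure the already-created Hive catalog.
           SparkSession spark = SparkSession.builder()
                   .config(conf)
                   .enableHiveSupport()
                   .getOrCreate();
   
           // With the catalog configured up front, Hive tables resolve without
           // a hive-site.xml in the Spark conf directory.
           spark.sql("SHOW DATABASES").show();
       }
   }
   ```
   
   The ordering is the whole point: once getOrCreate() has run, later calls that set Hive properties on the conf arrive too late for the catalog, which is exactly the failure mode this PR fixes.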

