myloginid opened a new issue, #7388:
URL: https://github.com/apache/kyuubi/issues/7388

   ### Observed behavior
   * Running the Cloudera-built Kyuubi parcel on CDH 7.3.2.
   * `control.sh` resolves `SPARK_HOME` to 
`/opt/cloudera/parcels/CDH-7.3.2-1.cdh7.3.2.p0.77083870/lib/spark3` before 
`/opt/.../bin/kyuubi run` is executed (see 
`/var/run/cloudera-scm-agent/process/.../logs/stderr.log`).
   * The `spark-submit` command that Kyuubi launches still contains the literal 
template `{{SPARK_HOME}}/assembly/target/scala-2.12/jars` and fails immediately 
with `Failed to find Spark jars directory 
({{SPARK_HOME}}/assembly/target/scala-2.12/jars). You need to build Spark with 
the target "package" before running this program.`
   * `spark3-conf/spark-defaults.conf` and the engine log 
(`/var/lib/kyuubi/spark/kyuubi-spark-sql-engine.log.5`) both show only the 
placeholder path.
   
   ### Expected behavior
   * The placeholder should be replaced with the real parcel path 
(`/opt/cloudera/parcels/CDH-7.3.2-1.cdh7.3.2.p0.77083870/lib/spark3`), so the 
engine can find its `jars/` directory instead of expecting a source-tree 
assembly build.
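   
   The quoted error hints at why the assembly path shows up: Spark's `bin/spark-class` first looks for a release-style `jars/` directory under `SPARK_HOME` and only falls back to the source-tree assembly path if that is missing. A hedged sketch of that lookup (paraphrased from memory, not the verbatim script) shows that with the literal placeholder neither directory can exist, so the fallback path is reported:
   
   ```shell
   # With {{SPARK_HOME}} unresolved, neither layout test can succeed, so the
   # source-tree fallback path ends up in the error message.
   SPARK_HOME='{{SPARK_HOME}}'        # the unresolved placeholder
   SPARK_SCALA_VERSION=2.12
   
   if [ -d "${SPARK_HOME}/jars" ]; then
     # release install: jars ship under $SPARK_HOME/jars
     SPARK_JARS_DIR="${SPARK_HOME}/jars"
   else
     # source tree: jars live under the assembly build output
     SPARK_JARS_DIR="${SPARK_HOME}/assembly/target/scala-${SPARK_SCALA_VERSION}/jars"
   fi
   
   if [ ! -d "$SPARK_JARS_DIR" ]; then
     echo "Failed to find Spark jars directory ($SPARK_JARS_DIR)." >&2
   fi
   ```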
   
   ### Reproduction steps
   1. Install the latest `KYUUBI-1.10.3-p1-SNAPSHOT` parcel in Cloudera Manager 
alongside CDH 7.3.2.
   2. Start the KYUUBI_SERVER role (Kerberos enabled); confirm 
`/var/run/cloudera-scm-agent/process/<pid>-kyuubi-KYUUBI_SERVER/conf/kyuubi-defaults.conf`
 lists `kerberos_auth_enable=true` and 
`spark.yarn.jars=local:/opt/.../lib/spark3/jars/*`.
   3. Run `kyuubi-beeline -u 
"jdbc:kyuubi://<host>:10009/default;kyuubiServerPrincipal=kyuubi/<host>@ROOT.COMOPS.SITE"`
 after `kinit`.
   4. Observe the backend log at 
`/var/lib/kyuubi/spark/kyuubi-spark-sql-engine.log.*` complaining about 
`{{SPARK_HOME}}/assembly/target/scala-2.12/jars`.
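   
   The check in step 4 amounts to grepping the engine log for the literal token. A self-contained sketch, using a sample line modeled on the error quoted above in place of the real `/var/lib/kyuubi/spark/kyuubi-spark-sql-engine.log.*` file:
   
   ```shell
   # Sample log line (hypothetical stand-in for the real engine log).
   sample='Failed to find Spark jars directory ({{SPARK_HOME}}/assembly/target/scala-2.12/jars).'
   
   # -F treats the pattern as a fixed string so the braces are matched literally.
   if printf '%s\n' "$sample" | grep -qF '{{SPARK_HOME}}'; then
     result='placeholder not resolved'
   else
     result='placeholder resolved'
   fi
   echo "$result"
   ```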
   
   ### Workaround
   * None; the engine needs to resolve the placeholder before launching.
   
   Could the templating used inside `SparkProcessBuilder`/`engine.spark` pick 
up the resolved `SPARK_HOME` value (from the environment or 
`spark-defaults.conf`) instead of leaving the literal `{{SPARK_HOME}}` in place?
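   
   A minimal sketch of the substitution being asked for, assuming the resolved value is available in the environment (the parcel path is taken from the report; the `sed` step is illustrative, not Kyuubi's actual templating code):
   
   ```shell
   # Resolved value, as control.sh computes it per the report.
   SPARK_HOME=/opt/cloudera/parcels/CDH-7.3.2-1.cdh7.3.2.p0.77083870/lib/spark3
   
   # Replace the literal token before the command line is launched.
   template='{{SPARK_HOME}}/jars/*'
   resolved=$(printf '%s' "$template" | sed "s|{{SPARK_HOME}}|${SPARK_HOME}|")
   echo "$resolved"
   ```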
   


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

