This is an automated email from the ASF dual-hosted git repository.

srowen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
     new 272428d  [SPARK-26600] Update spark-submit usage message
272428d is described below

commit 272428db6fabf43ceaa7d0ee9297916e5f9b5c24
Author:     Luca Canali <luca.can...@cern.ch>
AuthorDate: Wed Jan 16 20:55:28 2019 -0600

    [SPARK-26600] Update spark-submit usage message

    ## What changes were proposed in this pull request?

    The spark-submit usage message should be kept in sync with recent changes, in particular regarding K8S support. These are the proposed changes to the usage message:

    --executor-cores NUM -> can be used for Spark on YARN and K8S
    --principal PRINCIPAL and --keytab KEYTAB -> can be used for Spark on YARN and K8S
    --total-executor-cores NUM -> can be used for Spark standalone, YARN and K8S

    In addition this PR proposes to remove certain implementation details from the --keytab argument description, as the implementation details vary between YARN and K8S, for example.

    ## How was this patch tested?

    Manually tested

    Closes #23518 from LucaCanali/updateSparkSubmitArguments.

    Authored-by: Luca Canali <luca.can...@cern.ch>
    Signed-off-by: Sean Owen <sean.o...@databricks.com>
---
 .../apache/spark/deploy/SparkSubmitArguments.scala | 23 +++++++++++-----------
 1 file changed, 11 insertions(+), 12 deletions(-)

diff --git a/core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala b/core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala
index 34facd5..f5e4c4a 100644
--- a/core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala
@@ -576,27 +576,26 @@ private[deploy] class SparkSubmitArguments(args: Seq[String], env: Map[String, S
         | --kill SUBMISSION_ID           If given, kills the driver specified.
         | --status SUBMISSION_ID         If given, requests the status of the driver specified.
         |
-        | Spark standalone and Mesos only:
+        | Spark standalone, Mesos and Kubernetes only:
         |  --total-executor-cores NUM    Total cores for all executors.
         |
-        | Spark standalone and YARN only:
-        |  --executor-cores NUM          Number of cores per executor. (Default: 1 in YARN mode,
-        |                                or all available cores on the worker in standalone mode)
+        | Spark standalone, YARN and Kubernetes only:
+        |  --executor-cores NUM          Number of cores used by each executor. (Default: 1 in
+        |                                YARN and K8S modes, or all available cores on the worker
+        |                                in standalone mode).
         |
-        | YARN-only:
+        | Spark on YARN and Kubernetes only:
+        |  --principal PRINCIPAL         Principal to be used to login to KDC.
+        |  --keytab KEYTAB               The full path to the file that contains the keytab for the
+        |                                principal specified above.
+        |
+        | Spark on YARN only:
         |  --queue QUEUE_NAME            The YARN queue to submit to (Default: "default").
         |  --num-executors NUM           Number of executors to launch (Default: 2).
         |                                If dynamic allocation is enabled, the initial number of
         |                                executors will be at least NUM.
         |  --archives ARCHIVES           Comma separated list of archives to be extracted into the
         |                                working directory of each executor.
-        |  --principal PRINCIPAL         Principal to be used to login to KDC, while running on
-        |                                secure HDFS.
-        |  --keytab KEYTAB               The full path to the file that contains the keytab for the
-        |                                principal specified above. This keytab will be copied to
-        |                                the node running the Application Master via the Secure
-        |                                Distributed Cache, for renewing the login tickets and the
-        |                                delegation tokens periodically.
       """.stripMargin
     )
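For readers of the archive, here is a minimal sketch of a spark-submit invocation that combines the options touched by this change when targeting Kubernetes, per the updated help text above. The API server URL, container image, application class, principal, keytab path and jar location are illustrative placeholders, not values taken from this commit:

  # Illustrative only: with this change, --executor-cores and the
  # --principal/--keytab Kerberos options are documented for Kubernetes
  # as well as YARN deployments.
  ./bin/spark-submit \
    --master k8s://https://kube-apiserver.example.com:6443 \
    --deploy-mode cluster \
    --class org.apache.spark.examples.SparkPi \
    --conf spark.kubernetes.container.image=registry.example.com/spark:latest \
    --executor-cores 2 \
    --principal spark-user@EXAMPLE.COM \
    --keytab /path/to/spark-user.keytab \
    local:///opt/spark/examples/jars/spark-examples.jar 1000

The updated help text also lists --total-executor-cores NUM as applying to Spark standalone, Mesos and Kubernetes deployments, while --queue, --num-executors and --archives remain documented as YARN-only options.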
""".stripMargin ) --------------------------------------------------------------------- To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org For additional commands, e-mail: commits-h...@spark.apache.org