This is an automated email from the ASF dual-hosted git repository.

zjffdu pushed a commit to branch branch-0.9
in repository https://gitbox.apache.org/repos/asf/zeppelin.git


The following commit(s) were added to refs/heads/branch-0.9 by this push:
     new 02d61d6  [ZEPPELIN-4962]. Support for manually specifying the Java version of Spark Interpreter Scala REPL and fix the CI failure due to low Scala version
02d61d6 is described below

commit 02d61d6d19aba2ec57cad40fabab56da8ea30e31
Author: xiejiajun <xiejiaju...@163.com>
AuthorDate: Tue Jul 21 19:52:14 2020 +0800

    [ZEPPELIN-4962]. Support for manually specifying the Java version of Spark Interpreter Scala REPL and fix the CI failure due to low Scala version
    
    ### What is this PR for?
    - fix the [CI failure](https://travis-ci.org/github/apache/zeppelin/builds/709913046) due to [PR-3852](https://github.com/apache/zeppelin/pull/3852)
    
    ### What type of PR is it?
    [Bug Fix]
    
    ### Todos
    * [ ] - Task
    
    ### What is the Jira issue?
    * [ZEPPELIN-4962](https://issues.apache.org/jira/projects/ZEPPELIN/issues/ZEPPELIN-4962)
    
    ### How should this be tested?
    * CI test
    
    ### Screenshots (if appropriate)
    
    ### Questions:
    * Do the license files need an update? NO
    * Are there breaking changes for older versions? NO
    * Does this need documentation? Yes
    
    Author: xiejiajun <xiejiaju...@163.com>
    Author: xie-jia-jun <xiejiaju...@163.com>
    Author: JakeXie <xiejiaju...@163.com>
    
    Closes #3860 from xiejiajun/ZEPPELIN-4962 and squashes the following commits:
    
    9128c9b28 [JakeXie] spark.repl.target docs update
    ad4c0e353 [xiejiajun] Clear irrelevant code
    a12d3a9c6 [xiejiajun] Support for manually specifying the Java version of Spark Interpreter Scala REPL and fix the CI failure due to low Scala version
    ab2b191b3 [xiejiajun] Merge branch 'master' of https://github.com/apache/zeppelin into apache-master
    5569788b3 [xiejiajun] Merge branch 'master' of https://github.com/apache/zeppelin into apache-master
    0a9af6cff [xiejiajun] Merge branch 'master' of https://github.com/apache/zeppelin into apache-master
    be36b371a [xiejiajun] Resolve conflicts from merging the Apache master branch
    1335d55ba [xiejiajun] Merge remote-tracking branch 'origin/master'
    fc59f576c [JakeXie] Merge pull request #4 from apache/master
    9cc70fea3 [xiejiajun] Merge remote-tracking branch 'origin/master'
    6ef9b23c1 [xie-jia-jun] Merge pull request #3 from apache/master
    45af87a46 [xiejiajun] added a timeout for getting the Thrift client, to avoid situations where the interpreter may not be restarted when the interpreter process exits unexpectedly
    f149c3b41 [xie-jia-jun] Merge pull request #1 from apache/master
    5d4b64598 [xie-jia-jun] Support OSSConfigStorage of Aliyun
    dbb6639a9 [xie-jia-jun] Add Aliyun OSS SDK
    bb4784915 [xie-jia-jun] Support S3ConfigStorage of AWS
    
    (cherry picked from commit 185ffd43e5410234deae6e34ca37046242675c9e)
    Signed-off-by: Jeff Zhang <zjf...@apache.org>
---
 docs/interpreter/spark.md                                   | 13 ++++++++++++-
 .../apache/zeppelin/spark/SparkScala210Interpreter.scala    |  3 +++
 .../apache/zeppelin/spark/SparkScala211Interpreter.scala    |  2 ++
 3 files changed, 17 insertions(+), 1 deletion(-)

diff --git a/docs/interpreter/spark.md b/docs/interpreter/spark.md
index 6a023bd..537ae60 100644
--- a/docs/interpreter/spark.md
+++ b/docs/interpreter/spark.md
@@ -200,10 +200,21 @@ You can also set other Spark properties which are not listed in the table. For a
       (ex: http://{{PORT}}-{{SERVICE_NAME}}.{{SERVICE_DOMAIN}})
      </td>
   </tr>
-  <td>spark.webui.yarn.useProxy</td>
+  <tr>
+    <td>spark.webui.yarn.useProxy</td>
     <td>false</td>
     <td>whether use yarn proxy url as spark weburl, e.g. http://localhost:8088/proxy/application_1583396598068_0004</td>
   </tr>
+  <tr>
+    <td>spark.repl.target</td>
+    <td>jvm-1.6</td>
+    <td>
+      Manually specify the Java version targeted by the Spark interpreter's Scala REPL. Available options:<br/>
+      scala-compiler v2.10.7 to v2.11.12 supports "jvm-1.5", "jvm-1.6", "jvm-1.7" and "jvm-1.8"; the default is "jvm-1.6".<br/>
+      scala-compiler v2.10.1 to v2.10.6 supports "jvm-1.5", "jvm-1.6" and "jvm-1.7"; the default is "jvm-1.6".<br/>
+      scala-compiler v2.12.x supports only "jvm-1.8", which is also its default.
+    </td>
+  </tr>
 </table>
 
 Without any configuration, Spark interpreter works out of box in local mode. But if you want to connect to your Spark cluster, you'll need to follow below two simple steps.
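As a usage sketch (not part of the patch itself), a user on the Scala 2.11 profile could pin the REPL to Java 8 bytecode by adding the new property to the Spark interpreter setting in the Zeppelin UI, exactly like any other Spark property:

```
spark.repl.target    jvm-1.8
```

When the property is absent, the interpreter falls back to jvm-1.6, matching the default documented in the table above.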
diff --git a/spark/scala-2.10/src/main/scala/org/apache/zeppelin/spark/SparkScala210Interpreter.scala b/spark/scala-2.10/src/main/scala/org/apache/zeppelin/spark/SparkScala210Interpreter.scala
index 0eac200..c093636 100644
--- a/spark/scala-2.10/src/main/scala/org/apache/zeppelin/spark/SparkScala210Interpreter.scala
+++ b/spark/scala-2.10/src/main/scala/org/apache/zeppelin/spark/SparkScala210Interpreter.scala
@@ -67,10 +67,13 @@ class SparkScala210Interpreter(override val conf: SparkConf,
       sparkHttpServer = server
       conf.set("spark.repl.class.uri", uri)
     }
+    val target = conf.get("spark.repl.target", "jvm-1.6")
 
     val settings = new Settings()
     settings.embeddedDefaults(sparkInterpreterClassLoader)
     settings.usejavacp.value = true
+    settings.target.value = target
+
     this.userJars = getUserJars()
     LOGGER.info("UserJars: " + userJars.mkString(File.pathSeparator))
     settings.classpath.value = userJars.mkString(File.pathSeparator)
diff --git a/spark/scala-2.11/src/main/scala/org/apache/zeppelin/spark/SparkScala211Interpreter.scala b/spark/scala-2.11/src/main/scala/org/apache/zeppelin/spark/SparkScala211Interpreter.scala
index cb5a016..3c3943b 100644
--- a/spark/scala-2.11/src/main/scala/org/apache/zeppelin/spark/SparkScala211Interpreter.scala
+++ b/spark/scala-2.11/src/main/scala/org/apache/zeppelin/spark/SparkScala211Interpreter.scala
@@ -66,12 +66,14 @@ class SparkScala211Interpreter(override val conf: SparkConf,
       sparkHttpServer = server
       conf.set("spark.repl.class.uri", uri)
     }
+    val target = conf.get("spark.repl.target", "jvm-1.6")
 
     val settings = new Settings()
     settings.processArguments(List("-Yrepl-class-based",
       "-Yrepl-outdir", s"${outputDir.getAbsolutePath}"), true)
     settings.embeddedDefaults(sparkInterpreterClassLoader)
     settings.usejavacp.value = true
+    settings.target.value = target
 
     this.userJars = getUserJars()
     LOGGER.info("UserJars: " + userJars.mkString(File.pathSeparator))
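Both hunks read the same property and hand it to the Scala compiler's `Settings`. A minimal standalone sketch of that logic is below; reading the value from a system property is illustrative only (Zeppelin reads it from `SparkConf`), and it assumes Scala 2.10/2.11's scala-compiler on the classpath:

```scala
import scala.tools.nsc.Settings

object ReplTargetSketch {
  def main(args: Array[String]): Unit = {
    // Mirror conf.get("spark.repl.target", "jvm-1.6") from the patch,
    // using a system property here purely for illustration.
    val target = sys.props.getOrElse("spark.repl.target", "jvm-1.6")

    val settings = new Settings()
    settings.usejavacp.value = true
    // Tell scalac which JVM bytecode version the REPL should emit.
    settings.target.value = target

    println(settings.target.value)
  }
}
```

Direct assignment to `settings.target.value` mirrors the patch; an unsupported value for the compiler version in use (e.g. "jvm-1.6" on scala-compiler 2.12.x) will fail at compile/run time rather than here.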
