Repository: spark
Updated Branches:
  refs/heads/branch-1.5 ed74d301a -> 86f9a3513


[SPARK-10495] [SQL] [BRANCH-1.5] Fix build.

Looks like 
https://github.com/apache/spark/commit/7ab4d17395e3dd71b53c1229d80ca1b3fbd1717b 
broke the 1.5 build.
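
For reference, the fix simply routes SparkContext access through the suite's
SQLContext, presumably because the branch-1.5 test trait does not expose a bare
sparkContext. A minimal sketch of the access pattern (the object name, temp
path, and sample data below are illustrative, not the actual suite code):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object RoundTripSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setMaster("local[2]").setAppName("round-trip-sketch"))
    val sqlContext = new SQLContext(sc)

    // Hypothetical output location for this sketch only.
    val path = java.nio.file.Files.createTempDirectory("json-sketch")
      .resolve("out").toString

    // Reach the SparkContext through the SQLContext, mirroring the fix:
    //   sparkContext.parallelize(...) -> sqlContext.sparkContext.parallelize(...)
    val allJSON = Seq("""{"a": 1}""", """{"a": 2}""")
    sqlContext.sparkContext.parallelize(allJSON, 1).saveAsTextFile(path)

    // Read the text back the same way, via sqlContext.sparkContext.
    val readBack = sqlContext.sparkContext.textFile(path).collect()
    println(readBack.mkString("\n"))

    sc.stop()
  }
}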

Author: Yin Huai <[email protected]>

Closes #8861 from yhuai/fixBuild.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/86f9a351
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/86f9a351
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/86f9a351

Branch: refs/heads/branch-1.5
Commit: 86f9a351332bb3fa1398b9ef13e78b7407c5e120
Parents: ed74d30
Author: Yin Huai <[email protected]>
Authored: Mon Sep 21 21:05:51 2015 -0700
Committer: Reynold Xin <[email protected]>
Committed: Mon Sep 21 21:05:51 2015 -0700

----------------------------------------------------------------------
 .../apache/spark/sql/execution/datasources/json/JsonSuite.scala  | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/86f9a351/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/json/JsonSuite.scala
----------------------------------------------------------------------
diff --git a/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/json/JsonSuite.scala b/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/json/JsonSuite.scala
index 602c77c..83fd96d 100644
--- a/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/json/JsonSuite.scala
+++ b/sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/json/JsonSuite.scala
@@ -1238,10 +1238,10 @@ class JsonSuite extends QueryTest with SharedSQLContext with TestJsonData {
       val allJSON =
         existingJSONData ++
           df.toJSON.collect() ++
-          sparkContext.textFile(path.getCanonicalPath).collect()
+          sqlContext.sparkContext.textFile(path.getCanonicalPath).collect()
 
       Utils.deleteRecursively(path)
-      sparkContext.parallelize(allJSON, 1).saveAsTextFile(path.getCanonicalPath)
+      sqlContext.sparkContext.parallelize(allJSON, 1).saveAsTextFile(path.getCanonicalPath)
 
       // Read data back with the schema specified.
       val col0Values =

