svn commit: r31419 - in /dev/spark/3.0.0-SNAPSHOT-2018_12_06_20_38-5a140b7-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-12-06 Thread pwendell
Author: pwendell Date: Fri Dec 7 04:50:30 2018 New Revision: 31419 Log: Apache Spark 3.0.0-SNAPSHOT-2018_12_06_20_38-5a140b7 docs [This commit notification would consist of 1764 parts, which exceeds the limit of 50 parts, so it was shortened to the summary.] ---

spark git commit: [SPARK-26298][BUILD] Upgrade Janino to 3.0.11

2018-12-06 Thread dongjoon
Repository: spark Updated Branches: refs/heads/master 5a140b784 -> 477226520 [SPARK-26298][BUILD] Upgrade Janino to 3.0.11 ## What changes were proposed in this pull request? This PR aims to upgrade the Janino compiler to the latest version, 3.0.11. The following are the changes from the [releas
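For reference, pinning this Janino release in a plain Maven build would look like the fragment below. These are Janino's published Maven coordinates; Spark's own build manages the version centrally in its parent pom rather than per-module like this.

```xml
<dependency>
  <groupId>org.codehaus.janino</groupId>
  <artifactId>janino</artifactId>
  <version>3.0.11</version>
</dependency>
```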

spark git commit: [SPARK-26263][SQL] Validate partition values with user provided schema

2018-12-06 Thread wenchen
Repository: spark Updated Branches: refs/heads/master bfc5569a5 -> 5a140b784 [SPARK-26263][SQL] Validate partition values with user provided schema ## What changes were proposed in this pull request? Currently, if the user provides a data schema, partition column values are converted according to it. But
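A minimal sketch of the idea behind this change: when the user supplies a schema, a partition value parsed from a directory name (e.g. `part=a`) should be validated against the declared column type rather than silently coerced. This is plain Python for illustration; the function name and type map are hypothetical, not Spark's internals.

```python
# Illustrative only: validate a raw partition value against a declared type,
# failing fast instead of silently converting (the behavior SPARK-26263 adds).
def validate_partition_value(raw: str, declared_type: str):
    """Parse 'raw' as 'declared_type', raising ValueError on a mismatch."""
    parsers = {"int": int, "float": float, "string": str}
    try:
        return parsers[declared_type](raw)
    except ValueError:
        raise ValueError(
            f"Partition value {raw!r} does not match declared type {declared_type!r}"
        )

# 'part=1' against an int column parses fine; 'part=a' should now fail fast.
print(validate_partition_value("1", "int"))
```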

spark git commit: [SPARK-26289][CORE] cleanup enablePerfMetrics parameter from BytesToBytesMap

2018-12-06 Thread wenchen
Repository: spark Updated Branches: refs/heads/master dbd90e544 -> bfc5569a5 [SPARK-26289][CORE] cleanup enablePerfMetrics parameter from BytesToBytesMap ## What changes were proposed in this pull request? `enablePerfMetrics` was originally designed in `BytesToBytesMap` to control `getNumHas

svn commit: r31417 - in /dev/spark/3.0.0-SNAPSHOT-2018_12_06_16_23-dbd90e5-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-12-06 Thread pwendell
Author: pwendell Date: Fri Dec 7 00:36:04 2018 New Revision: 31417 Log: Apache Spark 3.0.0-SNAPSHOT-2018_12_06_16_23-dbd90e5 docs [This commit notification would consist of 1764 parts, which exceeds the limit of 50 parts, so it was shortened to the summary.] ---

spark git commit: [SPARK-26194][K8S] Auto generate auth secret for k8s apps.

2018-12-06 Thread mcheah
Repository: spark Updated Branches: refs/heads/master b14a26ee5 -> dbd90e544 [SPARK-26194][K8S] Auto generate auth secret for k8s apps. This change modifies the logic in the SecurityManager to do two things: - also generate unique app secrets when k8s is being used - only store the secret in
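The core idea can be sketched in a few lines: use a pre-configured secret when one exists, otherwise generate a fresh random one per application. This is a stdlib-only stand-in, not Spark's `SecurityManager`; the function name and secret length are illustrative assumptions.

```python
# Illustrative stand-in for per-app auth secret generation (SPARK-26194 idea):
# prefer an explicitly configured secret, else mint a unique random one.
import secrets
from typing import Optional

def get_or_create_app_secret(configured: Optional[str]) -> str:
    """Return the configured secret if set, otherwise a fresh random secret."""
    return configured if configured else secrets.token_hex(32)

s1 = get_or_create_app_secret(None)
s2 = get_or_create_app_secret(None)
# Two apps without configured secrets get distinct 64-hex-char secrets.
print(s1 != s2, len(s1))
```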

svn commit: r31411 - in /dev/spark/3.0.0-SNAPSHOT-2018_12_06_12_05-b14a26e-docs: ./ _site/ _site/api/ _site/api/R/ _site/api/java/ _site/api/java/lib/ _site/api/java/org/ _site/api/java/org/apache/ _s

2018-12-06 Thread pwendell
Author: pwendell Date: Thu Dec 6 20:17:44 2018 New Revision: 31411 Log: Apache Spark 3.0.0-SNAPSHOT-2018_12_06_12_05-b14a26e docs [This commit notification would consist of 1764 parts, which exceeds the limit of 50 ones, so it was shortened to the summary.] ---

spark git commit: [SPARK-26236][SS] Add kafka delegation token support documentation.

2018-12-06 Thread vanzin
Repository: spark Updated Branches: refs/heads/master ecaa495b1 -> b14a26ee5 [SPARK-26236][SS] Add kafka delegation token support documentation. ## What changes were proposed in this pull request? Kafka delegation token support implemented in [PR#22598](https://github.com/apache/spark/pull/2
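For context, enabling Kafka delegation token support in that era's docs revolves around pointing Spark at the Kafka bootstrap servers over a SASL-capable protocol. The fragment below is a hedged sketch: the exact configuration keys should be checked against the Structured Streaming + Kafka integration guide for the Spark version in use, since they were later reorganized.

```shell
# Sketch only -- verify key names against the docs added by this commit.
spark-submit \
  --conf spark.kafka.bootstrap.servers=broker1:9093,broker2:9093 \
  --conf spark.kafka.security.protocol=SASL_SSL \
  my_streaming_app.py
```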

spark git commit: [SPARK-25274][PYTHON][SQL] In toPandas with Arrow send un-ordered record batches to improve performance

2018-12-06 Thread cutlerb
Repository: spark Updated Branches: refs/heads/master ab76900fe -> ecaa495b1 [SPARK-25274][PYTHON][SQL] In toPandas with Arrow send un-ordered record batches to improve performance ## What changes were proposed in this pull request? When executing `toPandas` with Arrow enabled, partitions th
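The optimization can be sketched without Spark or Arrow: let batches arrive at the driver in whatever order partitions finish, tag each with its partition index, and restore partition order once at the end instead of stalling on slow partitions. The names below are illustrative stand-ins, not Spark's internals.

```python
# Illustrative sketch of SPARK-25274's idea: accept (partition_index, batch)
# pairs in arbitrary arrival order, then reassemble them in partition order.
def collect_unordered(batches_with_index):
    """Return batches restored to partition order from out-of-order arrivals."""
    received = {}
    for idx, batch in batches_with_index:  # arrival order is arbitrary
        received[idx] = batch
    return [received[i] for i in sorted(received)]

# Partitions 2 and 0 finish before 1; the final result is still in order.
arrivals = [(2, "c"), (0, "a"), (1, "b")]
print(collect_unordered(arrivals))  # ['a', 'b', 'c']
```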