spark git commit: [SPARK-9664] [SQL] Remove UDAFRegistration and add apply to UserDefinedAggregateFunction.

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/branch-1.5 252eb6193 -> 29ace3bbf [SPARK-9664] [SQL] Remove UDAFRegistration and add apply to UserDefinedAggregateFunction. https://issues.apache.org/jira/browse/SPARK-9664 Author: Yin Huai Closes #7982 from yhuai/udafRegister and squashes the

spark git commit: [SPARK-9664] [SQL] Remove UDAFRegistration and add apply to UserDefinedAggregateFunction.

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master 9270bd06f -> d5a9af323 [SPARK-9664] [SQL] Remove UDAFRegistration and add apply to UserDefinedAggregateFunction. https://issues.apache.org/jira/browse/SPARK-9664 Author: Yin Huai Closes #7982 from yhuai/udafRegister and squashes the fol
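
The effect of adding `apply` is that an aggregate function object can be invoked directly, like an ordinary function, instead of going through a separate registration class. The shape of that API can be sketched in plain Python (hypothetical names; Spark's actual `UserDefinedAggregateFunction` is a Scala class with `initialize`/`update`/`merge`/`evaluate` methods):

```python
# Illustrative sketch only, not Spark's API: an aggregate with the
# initialize/update/merge/evaluate lifecycle that is also directly callable,
# mirroring the idea of adding `apply` to UserDefinedAggregateFunction.
class GeometricMean:
    def initialize(self):
        # Aggregation buffer: (count, running product).
        return (0, 1.0)

    def update(self, buffer, value):
        count, product = buffer
        return (count + 1, product * value)

    def merge(self, b1, b2):
        # Combine two partial buffers (e.g. from different partitions).
        return (b1[0] + b2[0], b1[1] * b2[1])

    def evaluate(self, buffer):
        count, product = buffer
        return product ** (1.0 / count) if count else None

    def __call__(self, values):
        # `apply`-style entry point: run the full aggregate over values.
        buf = self.initialize()
        for v in values:
            buf = self.update(buf, v)
        return self.evaluate(buf)

gm = GeometricMean()
print(gm([2.0, 8.0]))  # 4.0
```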

spark git commit: [SPARK-9674][SQL] Remove GeneratedAggregate.

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/branch-1.5 f24cd8cb9 -> 252eb6193 [SPARK-9674][SQL] Remove GeneratedAggregate. The new aggregate replaces the old GeneratedAggregate. Author: Reynold Xin Closes #7983 from rxin/remove-generated-agg and squashes the following commits: 8334aae [

spark git commit: [SPARK-9674][SQL] Remove GeneratedAggregate.

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master 119b59053 -> 9270bd06f [SPARK-9674][SQL] Remove GeneratedAggregate. The new aggregate replaces the old GeneratedAggregate. Author: Reynold Xin Closes #7983 from rxin/remove-generated-agg and squashes the following commits: 8334aae [Reyn

spark git commit: [SPARK-6923] [SPARK-7550] [SQL] Persists data source relations in Hive compatible format when possible

2015-08-05 Thread lian
Repository: spark Updated Branches: refs/heads/master 4581badbc -> 119b59053 [SPARK-6923] [SPARK-7550] [SQL] Persists data source relations in Hive compatible format when possible This PR is a fork of PR #5733 authored by chenghao-intel. For committers who are going to merge this PR, please s

spark git commit: [SPARK-9611] [SQL] Fixes a few corner cases when we spill a UnsafeFixedWidthAggregationMap

2015-08-05 Thread yhuai
Repository: spark Updated Branches: refs/heads/master 4399b7b09 -> 4581badbc [SPARK-9611] [SQL] Fixes a few corner cases when we spill a UnsafeFixedWidthAggregationMap This PR has the following three small fixes. 1. UnsafeKVExternalSorter does not use 0 as the initialSize to create an Unsaf

spark git commit: [SPARK-9611] [SQL] Fixes a few corner cases when we spill a UnsafeFixedWidthAggregationMap

2015-08-05 Thread yhuai
Repository: spark Updated Branches: refs/heads/branch-1.5 eb2229ac0 -> f24cd8cb9 [SPARK-9611] [SQL] Fixes a few corner cases when we spill a UnsafeFixedWidthAggregationMap This PR has the following three small fixes. 1. UnsafeKVExternalSorter does not use 0 as the initialSize to create an U

spark git commit: [SPARK-9651] Fix UnsafeExternalSorterSuite.

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/branch-1.5 5f037b3dc -> eb2229ac0 [SPARK-9651] Fix UnsafeExternalSorterSuite. First, it's probably a bad idea to call generated Scala methods from Java. In this case, the method being called wasn't actually "Utils.createTempDir()", but rather th

spark git commit: [SPARK-9651] Fix UnsafeExternalSorterSuite.

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master 8c320e45b -> 4399b7b09 [SPARK-9651] Fix UnsafeExternalSorterSuite. First, it's probably a bad idea to call generated Scala methods from Java. In this case, the method being called wasn't actually "Utils.createTempDir()", but rather the me

spark git commit: [SPARK-6591] [SQL] Python data source load options should auto convert common types into strings

2015-08-05 Thread davies
Repository: spark Updated Branches: refs/heads/branch-1.5 3b617e87c -> 5f037b3dc [SPARK-6591] [SQL] Python data source load options should auto convert common types into strings JIRA: https://issues.apache.org/jira/browse/SPARK-6591 Author: Yijie Shen Closes #7926 from yjshen/py_dsload_opt

spark git commit: [SPARK-6591] [SQL] Python data source load options should auto convert common types into strings

2015-08-05 Thread davies
Repository: spark Updated Branches: refs/heads/branch-1.4 369510c5a -> 10ea6fb4d [SPARK-6591] [SQL] Python data source load options should auto convert common types into strings JIRA: https://issues.apache.org/jira/browse/SPARK-6591 Author: Yijie Shen Closes #7926 from yjshen/py_dsload_opt

spark git commit: [SPARK-6591] [SQL] Python data source load options should auto convert common types into strings

2015-08-05 Thread davies
Repository: spark Updated Branches: refs/heads/master a018b8571 -> 8c320e45b [SPARK-6591] [SQL] Python data source load options should auto convert common types into strings JIRA: https://issues.apache.org/jira/browse/SPARK-6591 Author: Yijie Shen Closes #7926 from yjshen/py_dsload_opt and
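
The gist of the fix: option values passed from Python (`bool`, `int`, `float`) are stringified before being handed to the JVM, so users can write `header=True` instead of `header='true'`. A hedged sketch of one plausible conversion rule (Spark's exact rules may differ):

```python
# Illustrative sketch of the SPARK-6591 idea: data source load options
# coming from Python are converted to strings the JVM side understands.
def to_option_str(value):
    """Convert a common Python value to an option string."""
    if isinstance(value, bool):
        # JVM-side option parsing expects lowercase boolean literals.
        return "true" if value else "false"
    if value is None:
        return "null"
    return str(value)

options = {"header": True, "inferSchema": False, "samplingRatio": 0.5}
converted = {k: to_option_str(v) for k, v in options.items()}
print(converted)  # {'header': 'true', 'inferSchema': 'false', 'samplingRatio': '0.5'}
```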

spark git commit: [SPARK-5895] [ML] Add VectorSlicer - updated

2015-08-05 Thread meng
Repository: spark Updated Branches: refs/heads/branch-1.5 618dc63e7 -> 3b617e87c [SPARK-5895] [ML] Add VectorSlicer - updated Add VectorSlicer transformer to spark.ml, with features specified as either indices or names. Transfers feature attributes for selected features. Updated version of

spark git commit: [SPARK-5895] [ML] Add VectorSlicer - updated

2015-08-05 Thread meng
Repository: spark Updated Branches: refs/heads/master 9c878923d -> a018b8571 [SPARK-5895] [ML] Add VectorSlicer - updated Add VectorSlicer transformer to spark.ml, with features specified as either indices or names. Transfers feature attributes for selected features. Updated version of [htt
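
VectorSlicer selects a sub-vector either by positional indices or by feature names resolved through column attributes. The selection logic can be sketched in plain Python (illustrative parameter names, not Spark's exact API):

```python
def vector_slicer(vector, indices=None, names=None, name_to_index=None):
    """Select a sub-vector by positional indices and/or feature names.

    Plain-Python sketch of what an ML VectorSlicer transformer does;
    the name_to_index mapping stands in for column feature attributes.
    """
    selected = list(indices or [])
    if names:
        if name_to_index is None:
            raise ValueError("selecting by name requires a name -> index mapping")
        selected += [name_to_index[n] for n in names]
    return [vector[i] for i in selected]

features = [10.0, 20.0, 30.0, 40.0]
attrs = {"f0": 0, "f1": 1, "f2": 2, "f3": 3}
# Mix positional indices with a name resolved through the attributes.
print(vector_slicer(features, indices=[0, 2], names=["f3"], name_to_index=attrs))
# [10.0, 30.0, 40.0]
```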

spark git commit: [SPARK-9054] [SQL] Rename RowOrdering to InterpretedOrdering; use newOrdering in SMJ

2015-08-05 Thread joshrosen
Repository: spark Updated Branches: refs/heads/branch-1.5 30e9fcfb3 -> 618dc63e7 [SPARK-9054] [SQL] Rename RowOrdering to InterpretedOrdering; use newOrdering in SMJ This patch renames `RowOrdering` to `InterpretedOrdering` and updates SortMergeJoin to use the `SparkPlan` methods for const

spark git commit: [SPARK-9054] [SQL] Rename RowOrdering to InterpretedOrdering; use newOrdering in SMJ

2015-08-05 Thread joshrosen
Repository: spark Updated Branches: refs/heads/master dac090d1e -> 9c878923d [SPARK-9054] [SQL] Rename RowOrdering to InterpretedOrdering; use newOrdering in SMJ This patch renames `RowOrdering` to `InterpretedOrdering` and updates SortMergeJoin to use the `SparkPlan` methods for construct

spark git commit: [SPARK-9657] Fix return type of getMaxPatternLength

2015-08-05 Thread meng
Repository: spark Updated Branches: refs/heads/branch-1.5 05cbf133d -> 30e9fcfb3 [SPARK-9657] Fix return type of getMaxPatternLength mengxr Author: Feynman Liang Closes #7974 from feynmanliang/SPARK-9657 and squashes the following commits: 7ca533f [Feynman Liang] Fix return type of getMaxP

spark git commit: [SPARK-9657] Fix return type of getMaxPatternLength

2015-08-05 Thread meng
Repository: spark Updated Branches: refs/heads/master f9c2a2af1 -> dac090d1e [SPARK-9657] Fix return type of getMaxPatternLength mengxr Author: Feynman Liang Closes #7974 from feynmanliang/SPARK-9657 and squashes the following commits: 7ca533f [Feynman Liang] Fix return type of getMaxPatte

spark git commit: Closes #7474 since it's marked as won't fix.

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master 5f0fb6466 -> f9c2a2af1 Closes #7474 since it's marked as won't fix. Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/f9c2a2af Tree: http://git-wip-us.apache.org/repos/as

spark git commit: [SPARK-9649] Fix flaky test MasterSuite - randomize ports

2015-08-05 Thread yhuai
Repository: spark Updated Branches: refs/heads/branch-1.5 b8136d7e0 -> 05cbf133d [SPARK-9649] Fix flaky test MasterSuite - randomize ports

```
Error Message
Failed to bind to: /127.0.0.1:7093: Service 'sparkMaster' failed after 16 retries!
Stacktrace
java.net.BindException: Failed to
```

spark git commit: [SPARK-9649] Fix flaky test MasterSuite - randomize ports

2015-08-05 Thread yhuai
Repository: spark Updated Branches: refs/heads/master eb5b8f4a6 -> 5f0fb6466 [SPARK-9649] Fix flaky test MasterSuite - randomize ports

```
Error Message
Failed to bind to: /127.0.0.1:7093: Service 'sparkMaster' failed after 16 retries!
Stacktrace
java.net.BindException: Failed to bind
```
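
The fix replaces the fixed master port with randomized ones, so concurrent test runs stop colliding on a port like 7093. The retry-with-random-port idea can be sketched as follows (illustrative; the actual suite is Scala):

```python
import random
import socket

def bind_random_port(attempts=16):
    """Try binding to randomly chosen ports instead of a fixed one.

    Sketch of the test-flakiness fix: a collision on one random port just
    triggers another random choice rather than 16 retries on the same port.
    """
    for _ in range(attempts):
        port = random.randint(1024, 65535)
        s = socket.socket()
        try:
            s.bind(("127.0.0.1", port))
            return s, port
        except OSError:
            s.close()  # port in use; pick another one
    raise OSError("failed to bind after %d attempts" % attempts)

sock, port = bind_random_port()
print("bound to port", port)
sock.close()
```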

spark git commit: Closes #7778 since it is done as #7893.

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master e1e05873f -> eb5b8f4a6 Closes #7778 since it is done as #7893. Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/eb5b8f4a Tree: http://git-wip-us.apache.org/repos/asf/spa

spark git commit: [SPARK-9403] [SQL] Add codegen support in In and InSet

2015-08-05 Thread davies
Repository: spark Updated Branches: refs/heads/branch-1.5 19018d542 -> b8136d7e0 [SPARK-9403] [SQL] Add codegen support in In and InSet This continues tarekauel's work in #7778. Author: Liang-Chi Hsieh Author: Tarek Auel Closes #7893 from viirya/codegen_in and squashes the following commit

spark git commit: [SPARK-9403] [SQL] Add codegen support in In and InSet

2015-08-05 Thread davies
Repository: spark Updated Branches: refs/heads/master 1f8c364b9 -> e1e05873f [SPARK-9403] [SQL] Add codegen support in In and InSet This continues tarekauel's work in #7778. Author: Liang-Chi Hsieh Author: Tarek Auel Closes #7893 from viirya/codegen_in and squashes the following commits:
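
Codegen support for `In`/`InSet` means emitting a specialized predicate once instead of interpreting the expression tree for every row. A toy sketch of the technique (Spark generates Java source compiled by Janino; this Python version only illustrates the generate-then-compile pattern, with hypothetical names):

```python
def codegen_in(column, values):
    """Generate and compile a specialized membership predicate.

    Toy code-generation sketch: build source text for the IN check once,
    compile it, and return the resulting function for per-row evaluation.
    """
    literals = ", ".join(repr(v) for v in values)
    src = f"def generated_in(row):\n    return row[{column!r}] in {{{literals}}}\n"
    namespace = {}
    exec(src, namespace)  # compile the specialized predicate once
    return namespace["generated_in"]

pred = codegen_in("color", ["red", "green"])
print(pred({"color": "red"}))   # True
print(pred({"color": "blue"}))  # False
```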

spark git commit: [SPARK-9141] [SQL] [MINOR] Fix comments of PR #7920

2015-08-05 Thread yhuai
Repository: spark Updated Branches: refs/heads/branch-1.5 03bcf627d -> 19018d542 [SPARK-9141] [SQL] [MINOR] Fix comments of PR #7920 This is a follow-up of https://github.com/apache/spark/pull/7920 to fix comments. Author: Yin Huai Closes #7964 from yhuai/SPARK-9141-follow-up and squashes

spark git commit: [SPARK-9141] [SQL] [MINOR] Fix comments of PR #7920

2015-08-05 Thread yhuai
Repository: spark Updated Branches: refs/heads/master 7a969a696 -> 1f8c364b9 [SPARK-9141] [SQL] [MINOR] Fix comments of PR #7920 This is a follow-up of https://github.com/apache/spark/pull/7920 to fix comments. Author: Yin Huai Closes #7964 from yhuai/SPARK-9141-follow-up and squashes the

spark git commit: [SPARK-9519] [YARN] Confirm stop sc successfully when application was killed

2015-08-05 Thread vanzin
Repository: spark Updated Branches: refs/heads/branch-1.5 125827a4f -> 03bcf627d [SPARK-9519] [YARN] Confirm stop sc successfully when application was killed Currently, when we kill an application on YARN, sc.stop() will be called from the YARN application state monitor thread; then in YarnClientSched

spark git commit: [SPARK-9519] [YARN] Confirm stop sc successfully when application was killed

2015-08-05 Thread vanzin
Repository: spark Updated Branches: refs/heads/master 23d982204 -> 7a969a696 [SPARK-9519] [YARN] Confirm stop sc successfully when application was killed Currently, when we kill an application on YARN, sc.stop() will be called from the YARN application state monitor thread; then in YarnClientScheduler

spark git commit: [SPARK-9141] [SQL] Remove project collapsing from DataFrame API

2015-08-05 Thread yhuai
Repository: spark Updated Branches: refs/heads/branch-1.5 eedb996dd -> 125827a4f [SPARK-9141] [SQL] Remove project collapsing from DataFrame API Currently we collapse successive projections that are added by `withColumn`. However, this optimization violates the constraint that adding nodes t

spark git commit: [SPARK-9141] [SQL] Remove project collapsing from DataFrame API

2015-08-05 Thread yhuai
Repository: spark Updated Branches: refs/heads/master 34dcf1010 -> 23d982204 [SPARK-9141] [SQL] Remove project collapsing from DataFrame API Currently we collapse successive projections that are added by `withColumn`. However, this optimization violates the constraint that adding nodes to a

spark git commit: [SPARK-6486] [MLLIB] [PYTHON] Add BlockMatrix to PySpark.

2015-08-05 Thread meng
Repository: spark Updated Branches: refs/heads/master 519cf6d3f -> 34dcf1010 [SPARK-6486] [MLLIB] [PYTHON] Add BlockMatrix to PySpark. mengxr This adds the `BlockMatrix` to PySpark. I have the conversions to `IndexedRowMatrix` and `CoordinateMatrix` ready as well, so once PR #7554 is comple

spark git commit: [SPARK-6486] [MLLIB] [PYTHON] Add BlockMatrix to PySpark.

2015-08-05 Thread meng
Repository: spark Updated Branches: refs/heads/branch-1.5 350006497 -> eedb996dd [SPARK-6486] [MLLIB] [PYTHON] Add BlockMatrix to PySpark. mengxr This adds the `BlockMatrix` to PySpark. I have the conversions to `IndexedRowMatrix` and `CoordinateMatrix` ready as well, so once PR #7554 is co
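
A `BlockMatrix` stores a large matrix as a distributed collection of `((blockRow, blockCol), sub-matrix)` entries. The block partitioning itself can be sketched in plain Python (illustrative; PySpark's class wraps an RDD of such blocks):

```python
def to_blocks(matrix, rows_per_block, cols_per_block):
    """Split a dense matrix (list of row lists) into block entries.

    Returns {(block_row, block_col): sub_matrix}, the layout a distributed
    BlockMatrix keeps, with one entry per rectangular tile.
    """
    n_rows, n_cols = len(matrix), len(matrix[0])
    blocks = {}
    for i in range(0, n_rows, rows_per_block):
        for j in range(0, n_cols, cols_per_block):
            tile = [row[j:j + cols_per_block] for row in matrix[i:i + rows_per_block]]
            blocks[(i // rows_per_block, j // cols_per_block)] = tile
    return blocks

m = [[1, 2, 3, 4],
     [5, 6, 7, 8]]
blocks = to_blocks(m, rows_per_block=2, cols_per_block=2)
print(blocks[(0, 1)])  # [[3, 4], [7, 8]]
```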

spark git commit: [SPARK-9381] [SQL] Migrate JSON data source to the new partitioning data source

2015-08-05 Thread lian
Repository: spark Updated Branches: refs/heads/master eb8bfa3ea -> 519cf6d3f [SPARK-9381] [SQL] Migrate JSON data source to the new partitioning data source Support partitioning for the JSON data source. Still 2 open issues for the `HadoopFsRelation` - `refresh()` will invoke the `discoveryPa

spark git commit: [SPARK-9618] [SQL] Use the specified schema when reading Parquet files

2015-08-05 Thread lian
Repository: spark Updated Branches: refs/heads/master 70112ff22 -> eb8bfa3ea [SPARK-9618] [SQL] Use the specified schema when reading Parquet files The user specified schema is currently ignored when loading Parquet files. One workaround is to use the `format` and `load` methods instead of `p

spark git commit: [SPARK-9593] [SQL] Fixes Hadoop shims loading

2015-08-05 Thread lian
Repository: spark Updated Branches: refs/heads/master e27a8c4cb -> 70112ff22 [SPARK-9593] [SQL] Fixes Hadoop shims loading This PR is used to workaround CDH Hadoop versions like 2.0.0-mr1-cdh4.1.1. Internally, Hive `ShimLoader` tries to load different versions of Hadoop shims by checking ver

spark git commit: [SPARK-9607] [SPARK-9608] fix zinc-port handling in build/mvn

2015-08-05 Thread srowen
Repository: spark Updated Branches: refs/heads/branch-1.4 dea04bf84 -> 369510c5a [SPARK-9607] [SPARK-9608] fix zinc-port handling in build/mvn - pass `$ZINC_PORT` to zinc status/shutdown commands - fix path check that sets `$ZINC_INSTALL_FLAG`, which was incorrectly causing zinc to be shutdo

spark git commit: [SPARK-9607] [SPARK-9608] fix zinc-port handling in build/mvn

2015-08-05 Thread srowen
Repository: spark Updated Branches: refs/heads/branch-1.5 93c166a91 -> 350006497 [SPARK-9607] [SPARK-9608] fix zinc-port handling in build/mvn - pass `$ZINC_PORT` to zinc status/shutdown commands - fix path check that sets `$ZINC_INSTALL_FLAG`, which was incorrectly causing zinc to be shutdo

spark git commit: [SPARK-9607] [SPARK-9608] fix zinc-port handling in build/mvn

2015-08-05 Thread srowen
Repository: spark Updated Branches: refs/heads/branch-1.3 cd5d1be6e -> 384793dff [SPARK-9607] [SPARK-9608] fix zinc-port handling in build/mvn - pass `$ZINC_PORT` to zinc status/shutdown commands - fix path check that sets `$ZINC_INSTALL_FLAG`, which was incorrectly causing zinc to be shutdo

spark git commit: [SPARK-9607] [SPARK-9608] fix zinc-port handling in build/mvn

2015-08-05 Thread srowen
Repository: spark Updated Branches: refs/heads/master 26b06f1c4 -> e27a8c4cb [SPARK-9607] [SPARK-9608] fix zinc-port handling in build/mvn - pass `$ZINC_PORT` to zinc status/shutdown commands - fix path check that sets `$ZINC_INSTALL_FLAG`, which was incorrectly causing zinc to be shutdown a

spark git commit: [HOTFIX] Add static import to fix build break from #7676.

2015-08-05 Thread joshrosen
Repository: spark Updated Branches: refs/heads/branch-1.5 f288cca3d -> 93c166a91 [HOTFIX] Add static import to fix build break from #7676. Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/93c166a9 Tree: http://git-wip-us.ap

spark git commit: [HOTFIX] Add static import to fix build break from #7676.

2015-08-05 Thread joshrosen
Repository: spark Updated Branches: refs/heads/master 84ca3183b -> 26b06f1c4 [HOTFIX] Add static import to fix build break from #7676. Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/26b06f1c Tree: http://git-wip-us.apache

spark git commit: [SPARK-9628] [SQL] Rename int to SQLDate, long to SQLTimestamp for better readability

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master 1b0317f64 -> 84ca3183b [SPARK-9628] [SQL] Rename int to SQLDate, long to SQLTimestamp for better readability JIRA: https://issues.apache.org/jira/browse/SPARK-9628 Author: Yijie Shen Closes #7953 from yjshen/datetime_alias and squashes th

spark git commit: [SPARK-9628] [SQL] Rename int to SQLDate, long to SQLTimestamp for better readability

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/branch-1.5 ebc3aad27 -> f288cca3d [SPARK-9628] [SQL] Rename int to SQLDate, long to SQLTimestamp for better readability JIRA: https://issues.apache.org/jira/browse/SPARK-9628 Author: Yijie Shen Closes #7953 from yjshen/datetime_alias and squashe

[2/2] spark git commit: [SPARK-8861][SPARK-8862][SQL] Add basic instrumentation to each SparkPlan operator and add a new SQL tab

2015-08-05 Thread rxin
[SPARK-8861][SPARK-8862][SQL] Add basic instrumentation to each SparkPlan operator and add a new SQL tab This PR includes the following changes: ### SPARK-8862: Add basic instrumentation to each SparkPlan operator A SparkPlan can override `def accumulators: Map[String, Accumulator[_]]` to expo

[1/2] spark git commit: [SPARK-8861][SPARK-8862][SQL] Add basic instrumentation to each SparkPlan operator and add a new SQL tab

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/branch-1.5 6306019ff -> ebc3aad27 http://git-wip-us.apache.org/repos/asf/spark/blob/ebc3aad2/sql/core/src/test/scala/org/apache/spark/sql/ui/SQLListenerSuite.scala diff --git a

[2/2] spark git commit: [SPARK-8861][SPARK-8862][SQL] Add basic instrumentation to each SparkPlan operator and add a new SQL tab

2015-08-05 Thread rxin
[SPARK-8861][SPARK-8862][SQL] Add basic instrumentation to each SparkPlan operator and add a new SQL tab This PR includes the following changes: ### SPARK-8862: Add basic instrumentation to each SparkPlan operator A SparkPlan can override `def accumulators: Map[String, Accumulator[_]]` to expo

[1/2] spark git commit: [SPARK-8861][SPARK-8862][SQL] Add basic instrumentation to each SparkPlan operator and add a new SQL tab

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master 1bf608b5e -> 1b0317f64 http://git-wip-us.apache.org/repos/asf/spark/blob/1b0317f6/sql/core/src/test/scala/org/apache/spark/sql/ui/SQLListenerSuite.scala diff --git a/sql

spark git commit: [SPARK-9601] [DOCS] Fix JavaPairDStream signature for stream-stream and windowed join in streaming guide doc

2015-08-05 Thread tdas
Repository: spark Updated Branches: refs/heads/branch-1.5 7fa419535 -> 6306019ff [SPARK-9601] [DOCS] Fix JavaPairDStream signature for stream-stream and windowed join in streaming guide doc Author: Namit Katariya Closes #7935 from namitk/SPARK-9601 and squashes the following commits: 03b57

spark git commit: [SPARK-9601] [DOCS] Fix JavaPairDStream signature for stream-stream and windowed join in streaming guide doc

2015-08-05 Thread tdas
Repository: spark Updated Branches: refs/heads/master 6d8a6e416 -> 1bf608b5e [SPARK-9601] [DOCS] Fix JavaPairDStream signature for stream-stream and windowed join in streaming guide doc Author: Namit Katariya Closes #7935 from namitk/SPARK-9601 and squashes the following commits: 03b5784 [

spark git commit: [SPARK-9360] [SQL] Support BinaryType in PrefixComparators for UnsafeExternalSort

2015-08-05 Thread davies
Repository: spark Updated Branches: refs/heads/branch-1.5 57596fb8c -> 7fa419535 [SPARK-9360] [SQL] Support BinaryType in PrefixComparators for UnsafeExternalSort The current implementation of UnsafeExternalSort uses NoOpPrefixComparator for binary-typed data. So, we need to add BinaryPrefix

spark git commit: [SPARK-9360] [SQL] Support BinaryType in PrefixComparators for UnsafeExternalSort

2015-08-05 Thread davies
Repository: spark Updated Branches: refs/heads/master d8ef538e5 -> 6d8a6e416 [SPARK-9360] [SQL] Support BinaryType in PrefixComparators for UnsafeExternalSort The current implementation of UnsafeExternalSort uses NoOpPrefixComparator for binary-typed data. So, we need to add BinaryPrefixComp
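
A prefix comparator lets the sorter order records by a cheap 8-byte prefix before falling back to full key comparison; using `NoOpPrefixComparator` for binary keys forfeited that optimization. Packing the first 8 bytes of a binary key into an unsigned 64-bit prefix can be sketched as follows (illustrative; Spark's exact encoding may differ):

```python
def binary_prefix(data: bytes) -> int:
    """Pack the first 8 bytes of a binary key into an unsigned 64-bit prefix.

    Sketch of the BinaryPrefixComparator idea: comparing these integers
    agrees with lexicographic byte order on the first 8 bytes, so most
    sort comparisons never need to touch the full keys.
    """
    prefix = 0
    for i in range(8):
        byte = data[i] if i < len(data) else 0  # zero-pad short keys
        prefix = (prefix << 8) | byte
    return prefix

a, b = b"apple", b"apricot"
# Prefix order matches full lexicographic byte order for these keys.
print(binary_prefix(a) < binary_prefix(b))  # True
```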

spark git commit: Closes #7917

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master 1d1a76c8c -> d8ef538e5 Closes #7917 Project: http://git-wip-us.apache.org/repos/asf/spark/repo Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/d8ef538e Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/d8ef538e Diff: http

spark git commit: [SPARK-9581][SQL] Add unit test for JSON UDT

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/branch-1.5 ea23e54ff -> 57596fb8c [SPARK-9581][SQL] Add unit test for JSON UDT This brings #7416 up-to-date by drubbo. Author: Emiliano Leporati Author: Reynold Xin Closes #7917 from rxin/udt-json-test and squashes the following commits: 93e3

spark git commit: [SPARK-9581][SQL] Add unit test for JSON UDT

2015-08-05 Thread rxin
Repository: spark Updated Branches: refs/heads/master c2a71f071 -> 1d1a76c8c [SPARK-9581][SQL] Add unit test for JSON UDT This brings #7416 up-to-date by drubbo. Author: Emiliano Leporati Author: Reynold Xin Closes #7917 from rxin/udt-json-test and squashes the following commits: 93e3954

spark git commit: [SPARK-9217] [STREAMING] Make the kinesis receiver reliable by recording sequence numbers

2015-08-05 Thread tdas
Repository: spark Updated Branches: refs/heads/master 781c8d71a -> c2a71f071 [SPARK-9217] [STREAMING] Make the kinesis receiver reliable by recording sequence numbers This PR is the second one in the larger issue of making the Kinesis integration reliable and provide WAL-free at-least once g

spark git commit: [SPARK-9217] [STREAMING] Make the kinesis receiver reliable by recording sequence numbers

2015-08-05 Thread tdas
Repository: spark Updated Branches: refs/heads/branch-1.5 b6e8446a4 -> ea23e54ff [SPARK-9217] [STREAMING] Make the kinesis receiver reliable by recording sequence numbers This PR is the second one in the larger issue of making the Kinesis integration reliable and provide WAL-free at-least on
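
The reliability idea behind SPARK-9217: rather than depending on a write-ahead log, record the range of Kinesis sequence numbers received in each batch, so a failed batch can be re-read from Kinesis itself for at-least-once delivery. A hedged sketch of that bookkeeping (class and method names are illustrative, not Spark's):

```python
class SequenceRangeTracker:
    """Sketch of per-batch sequence-number bookkeeping for a Kinesis receiver.

    Each batch remembers which shard it read and the first/last sequence
    numbers it covered; on failure, that range is fetched again from Kinesis
    instead of being replayed from a WAL.
    """

    def __init__(self):
        self.ranges = {}  # batch_id -> (shard_id, first_seq, last_seq)

    def record(self, batch_id, shard_id, seq_numbers):
        self.ranges[batch_id] = (shard_id, min(seq_numbers), max(seq_numbers))

    def replay_range(self, batch_id):
        # On recovery, re-fetch exactly this range (at-least-once semantics).
        return self.ranges[batch_id]

t = SequenceRangeTracker()
t.record(batch_id=1, shard_id="shard-0", seq_numbers=["49590", "49591", "49593"])
print(t.replay_range(1))  # ('shard-0', '49590', '49593')
```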