Repository: spark
Updated Branches:
refs/heads/master 4172ff80d -> 4e35c5a3d
[SPARK-12970][DOCS] Fix the example in StructType APIs for Scala and Java
## What changes were proposed in this pull request?
This PR fixes two things: the javadoc8 break below,
```
[error] .../spark/sql/hive/target/java/org/apache/spark/sql/hive/FindHiveSerdeTable.java:3: error: reference not found
[error] * Replaces {link SimpleCatalogRelation} with {link MetastoreRelation} if its table provider is hive.
```
and the example in `StructType`, which is rewritten as a self-contained example as below:
```scala
import org.apache.spark.sql._
import org.apache.spark.sql.types._
val struct =
  StructType(
    StructField("a", IntegerType, true) ::
    StructField("b", LongType, false) ::
    StructField("c", BooleanType, false) :: Nil)
// Extract a single StructField.
val singleField = struct("b")
// singleField: StructField = StructField(b,LongType,false)
// If this struct does not have a field called "d", it throws an exception.
struct("d")
// java.lang.IllegalArgumentException: Field "d" does not exist.
// ...
// Extract multiple StructFields. Field names are provided in a set.
// A StructType object will be returned.
val twoFields = struct(Set("b", "c"))
// twoFields: StructType =
// StructType(StructField(b,LongType,false), StructField(c,BooleanType,false))
// Any names without matching fields will throw an exception.
// For the case shown below, an exception is thrown due to "d".
struct(Set("b", "c", "d"))
// java.lang.IllegalArgumentException: Field "d" does not exist.
// ...
```
```scala
import org.apache.spark.sql._
import org.apache.spark.sql.types._
val innerStruct =
  StructType(
    StructField("f1", IntegerType, true) ::
    StructField("f2", LongType, false) ::
    StructField("f3", BooleanType, false) :: Nil)

val struct = StructType(
  StructField("a", innerStruct, true) :: Nil)
// Create a Row with the schema defined by struct
val row = Row(Row(1, 2, true))
```
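As a side note, the nested `Row` above lines its values up positionally with the schema's fields. That correspondence can be sketched without Spark; the `GenericRow` helper below is hypothetical, not Spark's `Row`:

```scala
// Sketch: values line up positionally with the schema's fields, and a
// nested struct field is just another row nested inside.
case class GenericRow(values: Any*) {
  def get(i: Int): Any = values(i)
}

// innerStruct declares (f1: Int, f2: Long, f3: Boolean); the outer
// struct wraps it as its single field "a".
val inner = GenericRow(1, 2L, true)
val row = GenericRow(inner)
```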
Also, when a requested field is missing, an exception is now thrown instead of the name being silently ignored.
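This behavioral change can be sketched independently of Spark with a minimal field-lookup helper. The `Struct`/`Field` types below are hypothetical stand-ins, not Spark's implementation:

```scala
// Minimal sketch of the new lookup semantics: unknown field names raise
// IllegalArgumentException instead of being silently ignored.
case class Field(name: String, nullable: Boolean)

case class Struct(fields: Seq[Field]) {
  private val byName: Map[String, Field] = fields.map(f => f.name -> f).toMap

  // Single-field lookup: throws for a missing name (previously returned null).
  def apply(name: String): Field =
    byName.getOrElse(
      name, throw new IllegalArgumentException(s"""Field "$name" does not exist."""))

  // Multi-field lookup: any unknown name in the set also throws
  // (previously unknown names were dropped from the result).
  def apply(names: Set[String]): Struct = {
    val missing = names.filterNot(byName.contains)
    if (missing.nonEmpty) {
      throw new IllegalArgumentException(
        missing.map(n => s"""Field "$n" does not exist.""").mkString(" "))
    }
    Struct(fields.filter(f => names.contains(f.name)))
  }
}
```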
## How was this patch tested?
Manually via `sbt unidoc`.
- Scaladoc
<img width="665" alt="2017-01-26 12 54 13"
src="https://cloud.githubusercontent.com/assets/6477701/22297905/1245620e-e362-11e6-9e22-43bb8d9871af.png">
- Javadoc
<img width="722" alt="2017-01-26 12 54 27"
src="https://cloud.githubusercontent.com/assets/6477701/22297899/0fd87e0c-e362-11e6-9033-7590bda1aea6.png">
<img width="702" alt="2017-01-26 12 54 32"
src="https://cloud.githubusercontent.com/assets/6477701/22297900/0fe14154-e362-11e6-9882-768381c53163.png">
Author: hyukjinkwon <[email protected]>
Closes #16703 from HyukjinKwon/SPARK-12970.
Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/4e35c5a3
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/4e35c5a3
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/4e35c5a3
Branch: refs/heads/master
Commit: 4e35c5a3d39522ba54685b5c3370a11ba6dbb230
Parents: 4172ff8
Author: hyukjinkwon <[email protected]>
Authored: Fri Jan 27 10:06:54 2017 +0000
Committer: Sean Owen <[email protected]>
Committed: Fri Jan 27 10:06:54 2017 +0000
----------------------------------------------------------------------
.../org/apache/spark/sql/types/StructType.scala | 32 +++++++++++---------
.../apache/spark/sql/hive/HiveStrategies.scala | 2 +-
2 files changed, 18 insertions(+), 16 deletions(-)
----------------------------------------------------------------------
http://git-wip-us.apache.org/repos/asf/spark/blob/4e35c5a3/sql/catalyst/src/main/scala/org/apache/spark/sql/types/StructType.scala
----------------------------------------------------------------------
diff --git a/sql/catalyst/src/main/scala/org/apache/spark/sql/types/StructType.scala b/sql/catalyst/src/main/scala/org/apache/spark/sql/types/StructType.scala
index 0205c13..ca0000a 100644
--- a/sql/catalyst/src/main/scala/org/apache/spark/sql/types/StructType.scala
+++ b/sql/catalyst/src/main/scala/org/apache/spark/sql/types/StructType.scala
@@ -37,8 +37,9 @@ import org.apache.spark.util.Utils
* For a [[StructType]] object, one or multiple [[StructField]]s can be extracted by names.
* If multiple [[StructField]]s are extracted, a [[StructType]] object will be returned.
* If a provided name does not have a matching field, it will be ignored. For the case
- * of extracting a single StructField, a `null` will be returned.
- * Example:
+ * of extracting a single [[StructField]], a `null` will be returned.
+ *
+ * Scala Example:
* {{{
* import org.apache.spark.sql._
* import org.apache.spark.sql.types._
@@ -53,28 +54,30 @@ import org.apache.spark.util.Utils
* val singleField = struct("b")
* // singleField: StructField = StructField(b,LongType,false)
*
- * // This struct does not have a field called "d". null will be returned.
- * val nonExisting = struct("d")
- * // nonExisting: StructField = null
+ * // If this struct does not have a field called "d", it throws an exception.
+ * struct("d")
+ * // java.lang.IllegalArgumentException: Field "d" does not exist.
+ * // ...
*
* // Extract multiple StructFields. Field names are provided in a set.
* // A StructType object will be returned.
* val twoFields = struct(Set("b", "c"))
* // twoFields: StructType =
- * // StructType(List(StructField(b,LongType,false), StructField(c,BooleanType,false)))
+ * // StructType(StructField(b,LongType,false), StructField(c,BooleanType,false))
*
- * // Any names without matching fields will be ignored.
- * // For the case shown below, "d" will be ignored and
- * // it is treated as struct(Set("b", "c")).
- * val ignoreNonExisting = struct(Set("b", "c", "d"))
- * // ignoreNonExisting: StructType =
- * // StructType(List(StructField(b,LongType,false), StructField(c,BooleanType,false)))
+ * // Any names without matching fields will throw an exception.
+ * // For the case shown below, an exception is thrown due to "d".
+ * struct(Set("b", "c", "d"))
+ * // java.lang.IllegalArgumentException: Field "d" does not exist.
+ * // ...
* }}}
*
- * A [[org.apache.spark.sql.Row]] object is used as a value of the StructType.
- * Example:
+ * A [[org.apache.spark.sql.Row]] object is used as a value of the [[StructType]].
+ *
+ * Scala Example:
* {{{
* import org.apache.spark.sql._
+ * import org.apache.spark.sql.types._
*
* val innerStruct =
* StructType(
@@ -87,7 +90,6 @@ import org.apache.spark.util.Utils
*
* // Create a Row with the schema defined by struct
* val row = Row(Row(1, 2, true))
- * // row: Row = [[1,2,true]]
* }}}
*
* @since 1.3.0
http://git-wip-us.apache.org/repos/asf/spark/blob/4e35c5a3/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveStrategies.scala
----------------------------------------------------------------------
diff --git a/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveStrategies.scala b/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveStrategies.scala
index 9a7111a..badccae 100644
--- a/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveStrategies.scala
+++ b/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveStrategies.scala
@@ -97,7 +97,7 @@ class HiveAnalysis(session: SparkSession) extends Rule[LogicalPlan] {
}
/**
- * Replaces [[SimpleCatalogRelation]] with [[MetastoreRelation]] if its table provider is hive.
+ * Replaces `SimpleCatalogRelation` with [[MetastoreRelation]] if its table provider is hive.
*/
class FindHiveSerdeTable(session: SparkSession) extends Rule[LogicalPlan] {
override def apply(plan: LogicalPlan): LogicalPlan = plan transform {