This is an automated email from the ASF dual-hosted git repository.
wenchen pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git
The following commit(s) were added to refs/heads/master by this push:
new c69117742b7 [SPARK-45491] Add missing SQLSTATES 2/2
c69117742b7 is described below
commit c69117742b7c05fe58ca2b94cd713231dc8f7a3d
Author: srielau <[email protected]>
AuthorDate: Mon Oct 16 14:15:16 2023 +0800
[SPARK-45491] Add missing SQLSTATES 2/2
### What changes were proposed in this pull request?
Complete the addition of SQLSTATEs to all named error classes.
### Why are the changes needed?
We need SQLSTATEs to classify errors and catch them in JDBC/ODBC.
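As an illustration only (not part of this patch): a minimal Scala sketch of catching a Spark error by its SQLSTATE, assuming an active `SparkSession` named `spark` (e.g. in `spark-shell`); the query and state are taken from the golden files updated below.
```scala
import org.apache.spark.sql.AnalysisException

try {
  // Per the updated golden files, a non-foldable `scale` argument fails analysis
  // with error class NON_FOLDABLE_ARGUMENT, which this change maps to SQLSTATE 42K08.
  spark.sql("SELECT CEIL(2.5, 'a')").collect()
} catch {
  case e: AnalysisException if e.getSqlState == "42K08" =>
    println(s"Caught by SQLSTATE ${e.getSqlState}: ${e.getMessage}")
}
```
A JDBC/ODBC client would inspect the same value through `java.sql.SQLException#getSQLState`.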
### Does this PR introduce _any_ user-facing change?
Yes, SQLSTATEs are documented.
### How was this patch tested?
Ran existing QA.
### Was this patch authored or co-authored using generative AI tooling?
No
Closes #43376 from srielau/SPARK-45491-tenp-errors-sqlstates-2.
Authored-by: srielau <[email protected]>
Signed-off-by: Wenchen Fan <[email protected]>
---
common/utils/src/main/resources/error/README.md | 14 +-
.../src/main/resources/error/error-classes.json | 174 ++++++++++++++-------
.../org/apache/spark/SparkThrowableSuite.scala | 4 +
...-conditions-unsupported-add-file-error-class.md | 2 +-
...itions-unsupported-default-value-error-class.md | 2 +-
...conditions-unsupported-generator-error-class.md | 2 +-
...or-conditions-unsupported-insert-error-class.md | 2 +-
...ions-unsupported-merge-condition-error-class.md | 2 +-
...conditions-unsupported-overwrite-error-class.md | 2 +-
...conditions-unsupported-save-mode-error-class.md | 2 +-
docs/sql-error-conditions.md | 126 ++++++++-------
.../spark/sql/errors/QueryExecutionErrors.scala | 2 +-
.../ceil-floor-with-scale-param.sql.out | 8 +-
.../sql-tests/analyzer-results/extract.sql.out | 4 +-
.../analyzer-results/group-analytics.sql.out | 14 +-
.../named-function-arguments.sql.out | 4 +-
.../postgreSQL/window_part3.sql.out | 2 +
.../table-valued-functions.sql.out | 7 +-
.../udf/udf-group-analytics.sql.out | 14 +-
.../results/ceil-floor-with-scale-param.sql.out | 8 +-
.../resources/sql-tests/results/extract.sql.out | 4 +-
.../sql-tests/results/group-analytics.sql.out | 14 +-
.../results/named-function-arguments.sql.out | 4 +-
.../results/postgreSQL/window_part3.sql.out | 2 +
.../results/table-valued-functions.sql.out | 7 +-
.../results/udf/udf-group-analytics.sql.out | 14 +-
26 files changed, 279 insertions(+), 161 deletions(-)
diff --git a/common/utils/src/main/resources/error/README.md
b/common/utils/src/main/resources/error/README.md
index cccc3402f1a..ac388c29250 100644
--- a/common/utils/src/main/resources/error/README.md
+++ b/common/utils/src/main/resources/error/README.md
@@ -868,6 +868,10 @@ The following SQLSTATEs are collated from:
|42K0D |42 |Syntax error or Access Rule violation |K0D
|Invalid lambda function |Spark |N
|Spark
|
|42K0E |42 |Syntax error or Access Rule violation |K0E
|An expression is not valid in the context it is used |Spark |N
|Spark
|
|42K0F |42 |Syntax error or Access Rule violation |K0F |A
persisted object cannot reference a temporary object. |Spark |N
|Spark
|
+|42K0G |42 |Syntax error or Access Rule violation |K0G |A
protobuf is invalid |Spark |N
|Spark
|
+|42K0H |42 |Syntax error or Access Rule violation |K0H |A
cyclic invocation has been detected. |Spark |N
|Spark
|
+|42K0I |42 |Syntax error or Access Rule violation |K0I
|SQL Config not found. |Spark |N
|Spark
|
+|42K0J |42 |Syntax error or Access Rule violation |K0J
|Property not found. |Spark |N
|Spark
|
|42KD0 |42 |Syntax error or Access Rule violation |KD0
|Ambiguous name reference. |Databricks |N
|Databricks
|
|42KD1 |42 |Syntax error or Access Rule violation |KD1
|Operation not supported in READ ONLY session mode. |Databricks |N
|Databricks
|
|42KD2 |42 |Syntax error or Access Rule violation |KD2
|The source and target table names of a SYNC operation must be the
same.|Databricks |N |Databricks
|
@@ -1013,6 +1017,8 @@ The following SQLSTATEs are collated from:
|54058 |54 |SQL or Product Limit Exceeded |058
|The internal representation of an XML path is too long. |DB2 |N
|DB2
|
|54065 |54 |SQL or Product Limit Exceeded |065
|The maximum of 99999 implicitly generated object names has been exceeded.|DB2
|N |DB2
|
|54068 |54 |SQL or Product Limit Exceeded |068
|Seamless automatic client reroute retry limit exceeded. |DB2 |N
|DB2
|
+|54K00 |54 |SQL or Product Limit Exceeded |K00
|Maximum depth of nested views was exceeded. |Spark |N
|Spark
|
+|54KD0 |54 |SQL or Product Limit Exceeded |KD0
|Maximum UDF count in query plan exceeded. |Databricks |N
|Databricks
|
|55000 |55 |Object Not In Prerequisite State |000
|object_not_in_prerequisite_state |PostgreSQL |N
|PostgreSQL Redshift
|
|55002 |55 |Object Not in Prerequisite State |002
|The explanation table is not defined properly. |DB2 |N
|DB2
|
|55003 |55 |Object Not in Prerequisite State |003
|The DDL registration table is not defined properly. |DB2 |N
|DB2
|
@@ -1135,7 +1141,7 @@ The following SQLSTATEs are collated from:
|57P03 |57 |Operator Intervention |P03
|cannot_connect_now |PostgreSQL |N
|PostgreSQL Redshift
|
|57P04 |57 |Operator Intervention |P04
|database_dropped |PostgreSQL |N
|PostgreSQL
|
|57P05 |57 |Operator Intervention |P05
|idle_session_timeout |PostgreSQL |N
|PostgreSQL
|
-|58000 |58 |System Error (error external to PostgreSQL itself)|000
|system_error |PostgreSQL |N
|PostgreSQL Redshift
|
+|58000 |58 |System Error (error external to PostgreSQL itself)|000
|System error |PostgreSQL |N
|PostgreSQL Redshift
|
|58001 |58 |System Error |001
|The database cannot be created, because the assigned DBID is a duplicate.|DB2
|N |DB2
|
|58002 |58 |System Error |002
|An exit has returned an error or invalid data. |DB2 |N
|DB2
|
|58003 |58 |System Error |003
|An invalid section number was detected. |DB2 |N
|DB2
|
@@ -1153,9 +1159,9 @@ The following SQLSTATEs are collated from:
|58017 |58 |System Error |017
|The DDM parameter value is not supported. |DB2 |N
|DB2
|
|58018 |58 |System Error |018
|The DDM reply message is not supported. |DB2 |N
|DB2
|
|58026 |58 |System Error |026
|The number of variables in the statement is not equal to the number of
variables in SQLSTTVRB.|DB2 |N |DB2
|
-|58030 |58 |System Error (error external to PostgreSQL itself)|030
|io_error |PostgreSQL |N
|PostgreSQL Redshift
|
-|58P01 |58 |System Error (error external to PostgreSQL itself)|P01
|undefined_file |PostgreSQL |N
|PostgreSQL Redshift
|
-|58P02 |58 |System Error (error external to PostgreSQL itself)|P02
|duplicate_file |PostgreSQL |N
|PostgreSQL Redshift
|
+|58030 |58 |System Error (error external to PostgreSQL itself)|030
|I/O error |PostgreSQL |N
|PostgreSQL Redshift
|
+|58P01 |58 |System Error (error external to PostgreSQL itself)|P01
|Undefined file |PostgreSQL |N
|PostgreSQL Redshift
|
+|58P02 |58 |System Error (error external to PostgreSQL itself)|P02
|Duplicate file |PostgreSQL |N
|PostgreSQL Redshift
|
|5UA01 |5U |Common Utilities and Tools |A01
|The task cannot be removed because it is currently executing.|DB2
|N |DB2
|
|60000 |60 |system error |000
|system error |Oracle |N
|Oracle
|
|61000 |61 |shared server and detached process errors |000
|shared server and detached process errors |Oracle |N
|Oracle
|
diff --git a/common/utils/src/main/resources/error/error-classes.json
b/common/utils/src/main/resources/error/error-classes.json
index 147949901ca..7350d0331c4 100644
--- a/common/utils/src/main/resources/error/error-classes.json
+++ b/common/utils/src/main/resources/error/error-classes.json
@@ -2402,7 +2402,7 @@
"message" : [
"The function <funcName> requires the parameter <paramName> to be a
foldable expression of the type <paramType>, but the actual argument is a
non-foldable."
],
- "sqlState" : "22024"
+ "sqlState" : "42K08"
},
"NON_LAST_MATCHED_CLAUSE_OMIT_CONDITION" : {
"message" : [
@@ -2686,52 +2686,62 @@
"PROTOBUF_DEPENDENCY_NOT_FOUND" : {
"message" : [
"Could not find dependency: <dependencyName>."
- ]
+ ],
+ "sqlState" : "42K0G"
},
"PROTOBUF_DESCRIPTOR_FILE_NOT_FOUND" : {
"message" : [
"Error reading Protobuf descriptor file at path: <filePath>."
- ]
+ ],
+ "sqlState" : "42K0G"
},
"PROTOBUF_FIELD_MISSING" : {
"message" : [
"Searching for <field> in Protobuf schema at <protobufSchema> gave
<matchSize> matches. Candidates: <matches>."
- ]
+ ],
+ "sqlState" : "42K0G"
},
"PROTOBUF_FIELD_MISSING_IN_SQL_SCHEMA" : {
"message" : [
"Found <field> in Protobuf schema but there is no match in the SQL
schema."
- ]
+ ],
+ "sqlState" : "42K0G"
},
"PROTOBUF_FIELD_TYPE_MISMATCH" : {
"message" : [
"Type mismatch encountered for field: <field>."
- ]
+ ],
+ "sqlState" : "42K0G"
},
"PROTOBUF_MESSAGE_NOT_FOUND" : {
"message" : [
"Unable to locate Message <messageName> in Descriptor."
- ]
+ ],
+ "sqlState" : "42K0G"
},
"PROTOBUF_TYPE_NOT_SUPPORT" : {
"message" : [
"Protobuf type not yet supported: <protobufType>."
- ]
+ ],
+ "sqlState" : "42K0G"
},
"RECURSIVE_PROTOBUF_SCHEMA" : {
"message" : [
"Found recursive reference in Protobuf schema, which can not be
processed by Spark by default: <fieldDescriptor>. try setting the option
`recursive.fields.max.depth` 0 to 10. Going beyond 10 levels of recursion is
not allowed."
- ]
+ ],
+ "sqlState" : "42K0G"
},
"RECURSIVE_VIEW" : {
"message" : [
"Recursive view <viewIdent> detected (cycle: <newPath>)."
- ]
+ ],
+ "sqlState" : "42K0H"
},
"REF_DEFAULT_VALUE_IS_NOT_ALLOWED_IN_PARTITION" : {
"message" : [
"References to DEFAULT column values are not allowed within the
PARTITION clause."
- ]
+ ],
+ "sqlState" : "42601"
},
"RENAME_SRC_PATH_NOT_FOUND" : {
"message" : [
@@ -2786,8 +2796,10 @@
},
"SCALAR_SUBQUERY_IS_IN_GROUP_BY_OR_AGGREGATE_FUNCTION" : {
"message" : [
- "The correlated scalar subquery '<sqlExpr>' is neither present in GROUP
BY, nor in an aggregate function. Add it to GROUP BY using ordinal position or
wrap it in `first()` (or `first_value`) if you don't care which value you get."
- ]
+ "The correlated scalar subquery '<sqlExpr>' is neither present in GROUP
BY, nor in an aggregate function.",
+ "Add it to GROUP BY using ordinal position or wrap it in `first()` (or
`first_value`) if you don't care which value you get."
+ ],
+ "sqlState" : "0A000"
},
"SCALAR_SUBQUERY_TOO_MANY_ROWS" : {
"message" : [
@@ -2826,27 +2838,36 @@
"SEED_EXPRESSION_IS_UNFOLDABLE" : {
"message" : [
"The seed expression <seedExpr> of the expression <exprWithSeed> must be
foldable."
- ]
+ ],
+ "sqlState" : "42K08"
},
"SORT_BY_WITHOUT_BUCKETING" : {
"message" : [
"sortBy must be used together with bucketBy."
- ]
+ ],
+ "sqlState" : "42601"
},
"SPECIFY_BUCKETING_IS_NOT_ALLOWED" : {
"message" : [
- "Cannot specify bucketing information if the table schema is not
specified when creating and will be inferred at runtime."
- ]
+ "A CREATE TABLE without explicit column list cannot specify bucketing
information.",
+ "Please use the form with explicit column list and specify bucketing
information.",
+ "Alternatively, allow bucketing information to be inferred by omitting
the clause."
+ ],
+ "sqlState" : "42601"
},
"SPECIFY_PARTITION_IS_NOT_ALLOWED" : {
"message" : [
- "It is not allowed to specify partition columns when the table schema is
not defined. When the table schema is not provided, schema and partition
columns will be inferred."
- ]
+ "A CREATE TABLE without explicit column list cannot specify PARTITIONED
BY.",
+ "Please use the form with explicit column list and specify PARTITIONED
BY.",
+ "Alternatively, allow partitioning to be inferred by omitting the
PARTITION BY clause."
+ ],
+ "sqlState" : "42601"
},
"SQL_CONF_NOT_FOUND" : {
"message" : [
"The SQL config <sqlConf> cannot be found. Please verify that the config
exists."
- ]
+ ],
+ "sqlState" : "42K0I"
},
"STAR_GROUP_BY_POS" : {
"message" : [
@@ -2857,17 +2878,20 @@
"STATIC_PARTITION_COLUMN_IN_INSERT_COLUMN_LIST" : {
"message" : [
"Static partition column <staticName> is also specified in the column
list."
- ]
+ ],
+ "sqlState" : "42713"
},
"STREAM_FAILED" : {
"message" : [
"Query [id = <id>, runId = <runId>] terminated with exception: <message>"
- ]
+ ],
+ "sqlState" : "XXKST"
},
"SUM_OF_LIMIT_AND_OFFSET_EXCEEDS_MAX_INT" : {
"message" : [
"The sum of the LIMIT clause and the OFFSET clause must not be greater
than the maximum 32-bit integer value (2,147,483,647) but found limit =
<limit>, offset = <offset>."
- ]
+ ],
+ "sqlState" : "22003"
},
"TABLE_OR_VIEW_ALREADY_EXISTS" : {
"message" : [
@@ -2887,7 +2911,8 @@
"TABLE_VALUED_FUNCTION_FAILED_TO_ANALYZE_IN_PYTHON" : {
"message" : [
"Failed to analyze the Python user defined table function: <msg>"
- ]
+ ],
+ "sqlState" : "38000"
},
"TABLE_VALUED_FUNCTION_REQUIRED_METADATA_INCOMPATIBLE_WITH_CALL" : {
"message" : [
@@ -2903,13 +2928,17 @@
},
"TABLE_VALUED_FUNCTION_TOO_MANY_TABLE_ARGUMENTS" : {
"message" : [
- "There are too many table arguments for table-valued function. It allows
one table argument, but got: <num>. If you want to allow it, please set
\"spark.sql.allowMultipleTableArguments.enabled\" to \"true\""
- ]
+ "There are too many table arguments for table-valued function.",
+ "It allows one table argument, but got: <num>.",
+ "If you want to allow it, please set
\"spark.sql.allowMultipleTableArguments.enabled\" to \"true\""
+ ],
+ "sqlState" : "54023"
},
"TASK_WRITE_FAILED" : {
"message" : [
"Task failed while writing rows to <path>."
- ]
+ ],
+ "sqlState" : "58030"
},
"TEMP_TABLE_OR_VIEW_ALREADY_EXISTS" : {
"message" : [
@@ -2932,8 +2961,11 @@
},
"UDTF_ALIAS_NUMBER_MISMATCH" : {
"message" : [
- "The number of aliases supplied in the AS clause does not match the
number of columns output by the UDTF. Expected <aliasesSize> aliases, but got
<aliasesNames>. Please ensure that the number of aliases provided matches the
number of columns output by the UDTF."
- ]
+ "The number of aliases supplied in the AS clause does not match the
number of columns output by the UDTF.",
+ "Expected <aliasesSize> aliases, but got <aliasesNames>.",
+ "Please ensure that the number of aliases provided matches the number of
columns output by the UDTF."
+ ],
+ "sqlState" : "42802"
},
"UNABLE_TO_ACQUIRE_MEMORY" : {
"message" : [
@@ -2944,7 +2976,14 @@
"UNABLE_TO_CONVERT_TO_PROTOBUF_MESSAGE_TYPE" : {
"message" : [
"Unable to convert SQL type <toType> to Protobuf type <protobufType>."
- ]
+ ],
+ "sqlState" : "42K0G"
+ },
+ "UNABLE_TO_FETCH_HIVE_TABLES" : {
+ "message" : [
+ "Unable to fetch tables of Hive database: <dbName>."
+ ],
+ "sqlState" : "58030"
},
"UNABLE_TO_INFER_SCHEMA" : {
"message" : [
@@ -2979,7 +3018,8 @@
"UNKNOWN_PROTOBUF_MESSAGE_TYPE" : {
"message" : [
"Attempting to treat <descriptorName> as a Message, but it was
<containingType>."
- ]
+ ],
+ "sqlState" : "42K0G"
},
"UNPIVOT_REQUIRES_ATTRIBUTES" : {
"message" : [
@@ -3019,8 +3059,12 @@
},
"UNRESOLVABLE_TABLE_VALUED_FUNCTION" : {
"message" : [
- "Could not resolve <name> to a table-valued function. Please make sure
that <name> is defined as a table-valued function and that all required
parameters are provided correctly. If <name> is not defined, please create the
table-valued function before using it. For more information about defining
table-valued functions, please refer to the Apache Spark documentation."
- ]
+ "Could not resolve <name> to a table-valued function.",
+ "Please make sure that <name> is defined as a table-valued function and
that all required parameters are provided correctly.",
+ "If <name> is not defined, please create the table-valued function
before using it.",
+ "For more information about defining table-valued functions, please
refer to the Apache Spark documentation."
+ ],
+ "sqlState" : "42883"
},
"UNRESOLVED_ALL_IN_GROUP_BY" : {
"message" : [
@@ -3103,7 +3147,8 @@
"UNSET_NONEXISTENT_PROPERTIES" : {
"message" : [
"Attempted to unset non-existent properties [<properties>] in table
<table>."
- ]
+ ],
+ "sqlState" : "42K0J"
},
"UNSUPPORTED_ADD_FILE" : {
"message" : [
@@ -3120,7 +3165,8 @@
"The local directory <path> is not supported in a non-local master
mode."
]
}
- }
+ },
+ "sqlState" : "0A000"
},
"UNSUPPORTED_ARROWTYPE" : {
"message" : [
@@ -3130,13 +3176,16 @@
},
"UNSUPPORTED_CHAR_OR_VARCHAR_AS_STRING" : {
"message" : [
- "The char/varchar type can't be used in the table schema. If you want
Spark treat them as string type as same as Spark 3.0 and earlier, please set
\"spark.sql.legacy.charVarcharAsString\" to \"true\"."
- ]
+ "The char/varchar type can't be used in the table schema.",
+ "If you want Spark treat them as string type as same as Spark 3.0 and
earlier, please set \"spark.sql.legacy.charVarcharAsString\" to \"true\"."
+ ],
+ "sqlState" : "0A000"
},
"UNSUPPORTED_DATASOURCE_FOR_DIRECT_QUERY" : {
"message" : [
"Unsupported data source type for direct query on files:
<dataSourceType>"
- ]
+ ],
+ "sqlState" : "0A000"
},
"UNSUPPORTED_DATATYPE" : {
"message" : [
@@ -3147,7 +3196,8 @@
"UNSUPPORTED_DATA_TYPE_FOR_DATASOURCE" : {
"message" : [
"The <format> datasource doesn't support the column <columnName> of the
type <columnType>."
- ]
+ ],
+ "sqlState" : "0A000"
},
"UNSUPPORTED_DEFAULT_VALUE" : {
"message" : [
@@ -3164,7 +3214,8 @@
"Enable it by setting \"spark.sql.defaultColumn.enabled\" to
\"true\"."
]
}
- }
+ },
+ "sqlState" : "0A000"
},
"UNSUPPORTED_DESERIALIZER" : {
"message" : [
@@ -3187,13 +3238,16 @@
"UNSUPPORTED_EXPRESSION_GENERATED_COLUMN" : {
"message" : [
"Cannot create generated column <fieldName> with generation expression
<expressionStr> because <reason>."
- ]
+ ],
+ "sqlState" : "42621"
},
"UNSUPPORTED_EXPR_FOR_OPERATOR" : {
"message" : [
- "A query operator contains one or more unsupported expressions. Consider
to rewrite it to avoid window functions, aggregate functions, and generator
functions in the WHERE clause.",
+ "A query operator contains one or more unsupported expressions.",
+ "Consider to rewrite it to avoid window functions, aggregate functions,
and generator functions in the WHERE clause.",
"Invalid expressions: [<invalidExprSqls>]"
- ]
+ ],
+ "sqlState" : "42K0E"
},
"UNSUPPORTED_EXPR_FOR_WINDOW" : {
"message" : [
@@ -3455,12 +3509,13 @@
]
}
},
- "sqlState" : "0A000"
+ "sqlState" : "42K0E"
},
"UNSUPPORTED_GROUPING_EXPRESSION" : {
"message" : [
"grouping()/grouping_id() can only be used with
GroupingSets/Cube/Rollup."
- ]
+ ],
+ "sqlState" : "42K0E"
},
"UNSUPPORTED_INSERT" : {
"message" : [
@@ -3487,7 +3542,8 @@
"The target relation <relationId> is also being read from."
]
}
- }
+ },
+ "sqlState" : "42809"
},
"UNSUPPORTED_MERGE_CONDITION" : {
"message" : [
@@ -3509,7 +3565,8 @@
"Subqueries are not allowed: <cond>."
]
}
- }
+ },
+ "sqlState" : "42K0E"
},
"UNSUPPORTED_OVERWRITE" : {
"message" : [
@@ -3526,7 +3583,8 @@
"The target table is <table>."
]
}
- }
+ },
+ "sqlState" : "42902"
},
"UNSUPPORTED_SAVE_MODE" : {
"message" : [
@@ -3543,7 +3601,8 @@
"a non-existent path."
]
}
- }
+ },
+ "sqlState" : "0A000"
},
"UNSUPPORTED_SUBQUERY_EXPRESSION_CATEGORY" : {
"message" : [
@@ -3678,7 +3737,8 @@
"message" : [
"The depth of view <viewName> exceeds the maximum view resolution depth
(<maxNestedDepth>).",
"Analysis is aborted to avoid errors. If you want to work around this,
please try to increase the value of \"spark.sql.view.maxNestedViewDepth\"."
- ]
+ ],
+ "sqlState" : "54K00"
},
"VIEW_NOT_FOUND" : {
"message" : [
@@ -3691,17 +3751,20 @@
"WINDOW_FUNCTION_AND_FRAME_MISMATCH" : {
"message" : [
"<funcName> function can only be evaluated in an ordered row-based
window frame with a single offset: <windowExpr>."
- ]
+ ],
+ "sqlState" : "42K0E"
},
"WINDOW_FUNCTION_WITHOUT_OVER_CLAUSE" : {
"message" : [
"Window function <funcName> requires an OVER clause."
- ]
+ ],
+ "sqlState" : "42601"
},
"WRITE_STREAM_NOT_ALLOWED" : {
"message" : [
"`writeStream` can be called only on streaming Dataset/DataFrame."
- ]
+ ],
+ "sqlState" : "42601"
},
"WRONG_COMMAND_FOR_OBJECT_TYPE" : {
"message" : [
@@ -5843,11 +5906,6 @@
"<cnf> when creating Hive client using classpath: <execJars> Please make
sure that jars for your version of hive and hadoop are included in the paths
passed to <key>."
]
},
- "_LEGACY_ERROR_TEMP_2196" : {
- "message" : [
- "Unable to fetch tables of db <dbName>."
- ]
- },
"_LEGACY_ERROR_TEMP_2197" : {
"message" : [
"LOCATION clause illegal for view partition."
diff --git a/core/src/test/scala/org/apache/spark/SparkThrowableSuite.scala
b/core/src/test/scala/org/apache/spark/SparkThrowableSuite.scala
index 4ef4aa1fe4a..bb9fe79071c 100644
--- a/core/src/test/scala/org/apache/spark/SparkThrowableSuite.scala
+++ b/core/src/test/scala/org/apache/spark/SparkThrowableSuite.scala
@@ -71,6 +71,9 @@ class SparkThrowableSuite extends SparkFunSuite {
def checkCondition(ss: Seq[String], fx: String => Boolean): Unit = {
ss.foreach { s =>
+ if (!fx(s)) {
+ print(s)
+ }
assert(fx(s))
}
}
@@ -545,6 +548,7 @@ class SparkThrowableSuite extends SparkFunSuite {
"""{
| "errorClass" : "UNSUPPORTED_SAVE_MODE.EXISTENT_PATH",
| "messageTemplate" : "The save mode <saveMode> is not supported for:
an existent path.",
+ | "sqlState" : "0A000",
| "messageParameters" : {
| "saveMode" : "UNSUPPORTED_MODE"
| }
diff --git a/docs/sql-error-conditions-unsupported-add-file-error-class.md
b/docs/sql-error-conditions-unsupported-add-file-error-class.md
index 6017e8505dd..80644e9c184 100644
--- a/docs/sql-error-conditions-unsupported-add-file-error-class.md
+++ b/docs/sql-error-conditions-unsupported-add-file-error-class.md
@@ -19,7 +19,7 @@ license: |
limitations under the License.
---
-SQLSTATE: none assigned
+[SQLSTATE:
0A000](sql-error-conditions-sqlstates.html#class-0A-feature-not-supported)
Don't support add file.
diff --git a/docs/sql-error-conditions-unsupported-default-value-error-class.md
b/docs/sql-error-conditions-unsupported-default-value-error-class.md
index 54597d3bc7e..c8a82f13ddd 100644
--- a/docs/sql-error-conditions-unsupported-default-value-error-class.md
+++ b/docs/sql-error-conditions-unsupported-default-value-error-class.md
@@ -19,7 +19,7 @@ license: |
limitations under the License.
---
-SQLSTATE: none assigned
+[SQLSTATE:
0A000](sql-error-conditions-sqlstates.html#class-0A-feature-not-supported)
DEFAULT column values is not supported.
diff --git a/docs/sql-error-conditions-unsupported-generator-error-class.md
b/docs/sql-error-conditions-unsupported-generator-error-class.md
index 7960c14767d..19150dcf472 100644
--- a/docs/sql-error-conditions-unsupported-generator-error-class.md
+++ b/docs/sql-error-conditions-unsupported-generator-error-class.md
@@ -19,7 +19,7 @@ license: |
limitations under the License.
---
-[SQLSTATE:
0A000](sql-error-conditions-sqlstates.html#class-0A-feature-not-supported)
+[SQLSTATE:
42K0E](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
The generator is not supported:
diff --git a/docs/sql-error-conditions-unsupported-insert-error-class.md
b/docs/sql-error-conditions-unsupported-insert-error-class.md
index abd6145f6c1..aafe616d069 100644
--- a/docs/sql-error-conditions-unsupported-insert-error-class.md
+++ b/docs/sql-error-conditions-unsupported-insert-error-class.md
@@ -19,7 +19,7 @@ license: |
limitations under the License.
---
-SQLSTATE: none assigned
+[SQLSTATE:
42809](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Can't insert into the target.
diff --git
a/docs/sql-error-conditions-unsupported-merge-condition-error-class.md
b/docs/sql-error-conditions-unsupported-merge-condition-error-class.md
index a24f0c9db97..d55c2063a36 100644
--- a/docs/sql-error-conditions-unsupported-merge-condition-error-class.md
+++ b/docs/sql-error-conditions-unsupported-merge-condition-error-class.md
@@ -19,7 +19,7 @@ license: |
limitations under the License.
---
-SQLSTATE: none assigned
+[SQLSTATE:
42K0E](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
MERGE operation contains unsupported `<condName>` condition.
diff --git a/docs/sql-error-conditions-unsupported-overwrite-error-class.md
b/docs/sql-error-conditions-unsupported-overwrite-error-class.md
index 42fbc071ab4..467150e0ad1 100644
--- a/docs/sql-error-conditions-unsupported-overwrite-error-class.md
+++ b/docs/sql-error-conditions-unsupported-overwrite-error-class.md
@@ -19,7 +19,7 @@ license: |
limitations under the License.
---
-SQLSTATE: none assigned
+[SQLSTATE:
42902](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Can't overwrite the target that is also being read from.
diff --git a/docs/sql-error-conditions-unsupported-save-mode-error-class.md
b/docs/sql-error-conditions-unsupported-save-mode-error-class.md
index 25bd31527e8..a5a3da2adf3 100644
--- a/docs/sql-error-conditions-unsupported-save-mode-error-class.md
+++ b/docs/sql-error-conditions-unsupported-save-mode-error-class.md
@@ -19,7 +19,7 @@ license: |
limitations under the License.
---
-SQLSTATE: none assigned
+[SQLSTATE:
0A000](sql-error-conditions-sqlstates.html#class-0A-feature-not-supported)
The save mode `<saveMode>` is not supported for:
diff --git a/docs/sql-error-conditions.md b/docs/sql-error-conditions.md
index 7cc897aee0d..78904780054 100644
--- a/docs/sql-error-conditions.md
+++ b/docs/sql-error-conditions.md
@@ -1376,7 +1376,7 @@ It is not allowed to use an aggregate function in the
argument of another aggreg
### NON_FOLDABLE_ARGUMENT
-[SQLSTATE: 22024](sql-error-conditions-sqlstates.html#class-22-data-exception)
+[SQLSTATE:
42K08](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
The function `<funcName>` requires the parameter `<paramName>` to be a
foldable expression of the type `<paramType>`, but the actual argument is a
non-foldable.
@@ -1611,61 +1611,61 @@ Rule `<rule>` in batch `<batch>` generated an invalid
plan: `<reason>`
### PROTOBUF_DEPENDENCY_NOT_FOUND
-SQLSTATE: none assigned
+[SQLSTATE:
42K0G](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Could not find dependency: `<dependencyName>`.
### PROTOBUF_DESCRIPTOR_FILE_NOT_FOUND
-SQLSTATE: none assigned
+[SQLSTATE:
42K0G](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Error reading Protobuf descriptor file at path: `<filePath>`.
### PROTOBUF_FIELD_MISSING
-SQLSTATE: none assigned
+[SQLSTATE:
42K0G](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Searching for `<field>` in Protobuf schema at `<protobufSchema>` gave
`<matchSize>` matches. Candidates: `<matches>`.
### PROTOBUF_FIELD_MISSING_IN_SQL_SCHEMA
-SQLSTATE: none assigned
+[SQLSTATE:
42K0G](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Found `<field>` in Protobuf schema but there is no match in the SQL schema.
### PROTOBUF_FIELD_TYPE_MISMATCH
-SQLSTATE: none assigned
+[SQLSTATE:
42K0G](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Type mismatch encountered for field: `<field>`.
### PROTOBUF_MESSAGE_NOT_FOUND
-SQLSTATE: none assigned
+[SQLSTATE:
42K0G](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Unable to locate Message `<messageName>` in Descriptor.
### PROTOBUF_TYPE_NOT_SUPPORT
-SQLSTATE: none assigned
+[SQLSTATE:
42K0G](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Protobuf type not yet supported: `<protobufType>`.
### RECURSIVE_PROTOBUF_SCHEMA
-SQLSTATE: none assigned
+[SQLSTATE:
42K0G](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Found recursive reference in Protobuf schema, which can not be processed by
Spark by default: `<fieldDescriptor>`. try setting the option
`recursive.fields.max.depth` 0 to 10. Going beyond 10 levels of recursion is
not allowed.
### RECURSIVE_VIEW
-SQLSTATE: none assigned
+[SQLSTATE:
42K0H](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Recursive view `<viewIdent>` detected (cycle: `<newPath>`).
### REF_DEFAULT_VALUE_IS_NOT_ALLOWED_IN_PARTITION
-SQLSTATE: none assigned
+[SQLSTATE:
42601](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
References to DEFAULT column values are not allowed within the PARTITION
clause.
@@ -1722,9 +1722,10 @@ Not found an id for the rule name "`<ruleName>`". Please
modify RuleIdCollection
### SCALAR_SUBQUERY_IS_IN_GROUP_BY_OR_AGGREGATE_FUNCTION
-SQLSTATE: none assigned
+[SQLSTATE:
0A000](sql-error-conditions-sqlstates.html#class-0A-feature-not-supported)
-The correlated scalar subquery '`<sqlExpr>`' is neither present in GROUP BY,
nor in an aggregate function. Add it to GROUP BY using ordinal position or wrap
it in `first()` (or `first_value`) if you don't care which value you get.
+The correlated scalar subquery '`<sqlExpr>`' is neither present in GROUP BY,
nor in an aggregate function.
+Add it to GROUP BY using ordinal position or wrap it in `first()` (or
`first_value`) if you don't care which value you get.
### SCALAR_SUBQUERY_TOO_MANY_ROWS
@@ -1762,31 +1763,35 @@ The second argument of `<functionName>` function needs
to be an integer.
### SEED_EXPRESSION_IS_UNFOLDABLE
-SQLSTATE: none assigned
+[SQLSTATE:
42K08](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
The seed expression `<seedExpr>` of the expression `<exprWithSeed>` must be
foldable.
### SORT_BY_WITHOUT_BUCKETING
-SQLSTATE: none assigned
+[SQLSTATE:
42601](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
sortBy must be used together with bucketBy.
### SPECIFY_BUCKETING_IS_NOT_ALLOWED
-SQLSTATE: none assigned
+[SQLSTATE:
42601](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
-Cannot specify bucketing information if the table schema is not specified when
creating and will be inferred at runtime.
+A CREATE TABLE without explicit column list cannot specify bucketing
information.
+Please use the form with explicit column list and specify bucketing
information.
+Alternatively, allow bucketing information to be inferred by omitting the
clause.
### SPECIFY_PARTITION_IS_NOT_ALLOWED
-SQLSTATE: none assigned
+[SQLSTATE:
42601](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
-It is not allowed to specify partition columns when the table schema is not
defined. When the table schema is not provided, schema and partition columns
will be inferred.
+A CREATE TABLE without explicit column list cannot specify PARTITIONED BY.
+Please use the form with explicit column list and specify PARTITIONED BY.
+Alternatively, allow partitioning to be inferred by omitting the PARTITION BY
clause.
### SQL_CONF_NOT_FOUND
-SQLSTATE: none assigned
+[SQLSTATE:
42K0I](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
The SQL config `<sqlConf>` cannot be found. Please verify that the config
exists.
@@ -1798,19 +1803,19 @@ Star (*) is not allowed in a select list when GROUP BY
an ordinal position is us
### STATIC_PARTITION_COLUMN_IN_INSERT_COLUMN_LIST
-SQLSTATE: none assigned
+[SQLSTATE:
42713](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Static partition column `<staticName>` is also specified in the column list.
### STREAM_FAILED
-SQLSTATE: none assigned
+[SQLSTATE: XXKST](sql-error-conditions-sqlstates.html#class-XX-internal-error)
Query [id = `<id>`, runId = `<runId>`] terminated with exception: `<message>`
### SUM_OF_LIMIT_AND_OFFSET_EXCEEDS_MAX_INT
-SQLSTATE: none assigned
+[SQLSTATE: 22003](sql-error-conditions-sqlstates.html#class-22-data-exception)
The sum of the LIMIT clause and the OFFSET clause must not be greater than the
maximum 32-bit integer value (2,147,483,647) but found limit = `<limit>`,
offset = `<offset>`.
@@ -1831,7 +1836,7 @@ To tolerate the error on drop use DROP VIEW IF EXISTS or
DROP TABLE IF EXISTS.
### TABLE_VALUED_FUNCTION_FAILED_TO_ANALYZE_IN_PYTHON
-SQLSTATE: none assigned
+[SQLSTATE:
38000](sql-error-conditions-sqlstates.html#class-38-external-routine-exception)
Failed to analyze the Python user defined table function: `<msg>`
@@ -1849,13 +1854,15 @@ Failed to evaluate the table function `<functionName>`
because its table metadat
### TABLE_VALUED_FUNCTION_TOO_MANY_TABLE_ARGUMENTS
-SQLSTATE: none assigned
+[SQLSTATE:
54023](sql-error-conditions-sqlstates.html#class-54-program-limit-exceeded)
-There are too many table arguments for table-valued function. It allows one
table argument, but got: `<num>`. If you want to allow it, please set
"spark.sql.allowMultipleTableArguments.enabled" to "true"
+There are too many table arguments for table-valued function.
+It allows one table argument, but got: `<num>`.
+If you want to allow it, please set
"spark.sql.allowMultipleTableArguments.enabled" to "true"
### TASK_WRITE_FAILED
-SQLSTATE: none assigned
+SQLSTATE: 58030
Task failed while writing rows to `<path>`.
@@ -1880,9 +1887,11 @@ Cannot initialize array with `<numElements>` elements of
size `<size>`.
### UDTF_ALIAS_NUMBER_MISMATCH
-SQLSTATE: none assigned
+[SQLSTATE:
42802](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
-The number of aliases supplied in the AS clause does not match the number of
columns output by the UDTF. Expected `<aliasesSize>` aliases, but got
`<aliasesNames>`. Please ensure that the number of aliases provided matches the
number of columns output by the UDTF.
+The number of aliases supplied in the AS clause does not match the number of
columns output by the UDTF.
+Expected `<aliasesSize>` aliases, but got `<aliasesNames>`.
+Please ensure that the number of aliases provided matches the number of
columns output by the UDTF.
### UNABLE_TO_ACQUIRE_MEMORY
@@ -1892,10 +1901,16 @@ Unable to acquire `<requestedBytes>` bytes of memory,
got `<receivedBytes>`.
### UNABLE_TO_CONVERT_TO_PROTOBUF_MESSAGE_TYPE
-SQLSTATE: none assigned
+[SQLSTATE:
42K0G](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Unable to convert SQL type `<toType>` to Protobuf type `<protobufType>`.
+### UNABLE_TO_FETCH_HIVE_TABLES
+
+SQLSTATE: 58030
+
+Unable to fetch tables of Hive database: `<dbName>`.
+
### UNABLE_TO_INFER_SCHEMA
[SQLSTATE:
42KD9](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
@@ -1928,7 +1943,7 @@ Cannot invoke function `<functionName>` because it
contains positional argument(
### UNKNOWN_PROTOBUF_MESSAGE_TYPE
-SQLSTATE: none assigned
+[SQLSTATE:
42K0G](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Attempting to treat `<descriptorName>` as a Message, but it was
`<containingType>`.
@@ -1970,9 +1985,12 @@ Unrecognized SQL type - name: `<typeName>`, id:
`<jdbcType>`.
### UNRESOLVABLE_TABLE_VALUED_FUNCTION
-SQLSTATE: none assigned
+[SQLSTATE:
42883](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
-Could not resolve `<name>` to a table-valued function. Please make sure that
`<name>` is defined as a table-valued function and that all required parameters
are provided correctly. If `<name>` is not defined, please create the
table-valued function before using it. For more information about defining
table-valued functions, please refer to the Apache Spark documentation.
+Could not resolve `<name>` to a table-valued function.
+Please make sure that `<name>` is defined as a table-valued function and that
all required parameters are provided correctly.
+If `<name>` is not defined, please create the table-valued function before
using it.
+For more information about defining table-valued functions, please refer to
the Apache Spark documentation.
### UNRESOLVED_ALL_IN_GROUP_BY
@@ -2024,13 +2042,13 @@ Cannot resolve variable `<variableName>` on search path
`<searchPath>`.
### UNSET_NONEXISTENT_PROPERTIES
-SQLSTATE: none assigned
+[SQLSTATE:
42K0J](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Attempted to unset non-existent properties [`<properties>`] in table `<table>`.
###
[UNSUPPORTED_ADD_FILE](sql-error-conditions-unsupported-add-file-error-class.html)
-SQLSTATE: none assigned
+[SQLSTATE:
0A000](sql-error-conditions-sqlstates.html#class-0A-feature-not-supported)
Don't support add file.
@@ -2044,13 +2062,14 @@ Unsupported arrow type `<typeName>`.
### UNSUPPORTED_CHAR_OR_VARCHAR_AS_STRING
-SQLSTATE: none assigned
+[SQLSTATE:
0A000](sql-error-conditions-sqlstates.html#class-0A-feature-not-supported)
-The char/varchar type can't be used in the table schema. If you want Spark
treat them as string type as same as Spark 3.0 and earlier, please set
"spark.sql.legacy.charVarcharAsString" to "true".
+The char/varchar type can't be used in the table schema.
+If you want Spark treat them as string type as same as Spark 3.0 and earlier,
please set "spark.sql.legacy.charVarcharAsString" to "true".
### UNSUPPORTED_DATASOURCE_FOR_DIRECT_QUERY
-SQLSTATE: none assigned
+[SQLSTATE:
0A000](sql-error-conditions-sqlstates.html#class-0A-feature-not-supported)
Unsupported data source type for direct query on files: `<dataSourceType>`
@@ -2062,13 +2081,13 @@ Unsupported data type `<typeName>`.
### UNSUPPORTED_DATA_TYPE_FOR_DATASOURCE
-SQLSTATE: none assigned
+[SQLSTATE:
0A000](sql-error-conditions-sqlstates.html#class-0A-feature-not-supported)
The `<format>` datasource doesn't support the column `<columnName>` of the
type `<columnType>`.
###
[UNSUPPORTED_DEFAULT_VALUE](sql-error-conditions-unsupported-default-value-error-class.html)
-SQLSTATE: none assigned
+[SQLSTATE:
0A000](sql-error-conditions-sqlstates.html#class-0A-feature-not-supported)
DEFAULT column values is not supported.
@@ -2084,15 +2103,16 @@ For more details see
[UNSUPPORTED_DESERIALIZER](sql-error-conditions-unsupported
### UNSUPPORTED_EXPRESSION_GENERATED_COLUMN
-SQLSTATE: none assigned
+[SQLSTATE:
42621](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Cannot create generated column `<fieldName>` with generation expression
`<expressionStr>` because `<reason>`.
### UNSUPPORTED_EXPR_FOR_OPERATOR
-SQLSTATE: none assigned
+[SQLSTATE:
42K0E](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
-A query operator contains one or more unsupported expressions. Consider to
rewrite it to avoid window functions, aggregate functions, and generator
functions in the WHERE clause.
+A query operator contains one or more unsupported expressions.
+Consider to rewrite it to avoid window functions, aggregate functions, and
generator functions in the WHERE clause.
Invalid expressions: [`<invalidExprSqls>`]
### UNSUPPORTED_EXPR_FOR_WINDOW
@@ -2111,7 +2131,7 @@ For more details see
[UNSUPPORTED_FEATURE](sql-error-conditions-unsupported-feat
###
[UNSUPPORTED_GENERATOR](sql-error-conditions-unsupported-generator-error-class.html)
-[SQLSTATE:
0A000](sql-error-conditions-sqlstates.html#class-0A-feature-not-supported)
+[SQLSTATE:
42K0E](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
The generator is not supported:
@@ -2119,13 +2139,13 @@ For more details see
[UNSUPPORTED_GENERATOR](sql-error-conditions-unsupported-ge
### UNSUPPORTED_GROUPING_EXPRESSION
-SQLSTATE: none assigned
+[SQLSTATE:
42K0E](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
grouping()/grouping_id() can only be used with GroupingSets/Cube/Rollup.
###
[UNSUPPORTED_INSERT](sql-error-conditions-unsupported-insert-error-class.html)
-SQLSTATE: none assigned
+[SQLSTATE:
42809](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Can't insert into the target.
@@ -2133,7 +2153,7 @@ For more details see
[UNSUPPORTED_INSERT](sql-error-conditions-unsupported-inser
###
[UNSUPPORTED_MERGE_CONDITION](sql-error-conditions-unsupported-merge-condition-error-class.html)
-SQLSTATE: none assigned
+[SQLSTATE:
42K0E](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
MERGE operation contains unsupported `<condName>` condition.
@@ -2141,7 +2161,7 @@ For more details see
[UNSUPPORTED_MERGE_CONDITION](sql-error-conditions-unsuppor
###
[UNSUPPORTED_OVERWRITE](sql-error-conditions-unsupported-overwrite-error-class.html)
-SQLSTATE: none assigned
+[SQLSTATE:
42902](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Can't overwrite the target that is also being read from.
@@ -2149,7 +2169,7 @@ For more details see
[UNSUPPORTED_OVERWRITE](sql-error-conditions-unsupported-ov
###
[UNSUPPORTED_SAVE_MODE](sql-error-conditions-unsupported-save-mode-error-class.html)
-SQLSTATE: none assigned
+[SQLSTATE:
0A000](sql-error-conditions-sqlstates.html#class-0A-feature-not-supported)
The save mode `<saveMode>` is not supported for:
@@ -2222,7 +2242,7 @@ Choose a different name, drop or replace the existing
object, or add the IF NOT
### VIEW_EXCEED_MAX_NESTED_DEPTH
-SQLSTATE: none assigned
+[SQLSTATE:
54K00](sql-error-conditions-sqlstates.html#class-54-program-limit-exceeded)
The depth of view `<viewName>` exceeds the maximum view resolution depth
(`<maxNestedDepth>`).
Analysis is aborted to avoid errors. If you want to work around this, please
try to increase the value of "spark.sql.view.maxNestedViewDepth".
@@ -2237,19 +2257,19 @@ To tolerate the error on drop use DROP VIEW IF EXISTS.
### WINDOW_FUNCTION_AND_FRAME_MISMATCH
-SQLSTATE: none assigned
+[SQLSTATE:
42K0E](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
`<funcName>` function can only be evaluated in an ordered row-based window
frame with a single offset: `<windowExpr>`.
### WINDOW_FUNCTION_WITHOUT_OVER_CLAUSE
-SQLSTATE: none assigned
+[SQLSTATE:
42601](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
Window function `<funcName>` requires an OVER clause.
### WRITE_STREAM_NOT_ALLOWED
-SQLSTATE: none assigned
+[SQLSTATE:
42601](sql-error-conditions-sqlstates.html#class-42-syntax-error-or-access-rule-violation)
`writeStream` can be called only on streaming Dataset/DataFrame.
diff --git
a/sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryExecutionErrors.scala
b/sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryExecutionErrors.scala
index 5396ae5ff70..8d7819de052 100644
---
a/sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryExecutionErrors.scala
+++
b/sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryExecutionErrors.scala
@@ -1694,7 +1694,7 @@ private[sql] object QueryExecutionErrors extends
QueryErrorsBase with ExecutionE
def cannotFetchTablesOfDatabaseError(dbName: String, e: Exception):
Throwable = {
new SparkException(
- errorClass = "_LEGACY_ERROR_TEMP_2196",
+ errorClass = "UNABLE_TO_FETCH_HIVE_TABLES",
messageParameters = Map(
"dbName" -> dbName),
cause = e)
diff --git
a/sql/core/src/test/resources/sql-tests/analyzer-results/ceil-floor-with-scale-param.sql.out
b/sql/core/src/test/resources/sql-tests/analyzer-results/ceil-floor-with-scale-param.sql.out
index 950584caa81..e4087d5108e 100644
---
a/sql/core/src/test/resources/sql-tests/analyzer-results/ceil-floor-with-scale-param.sql.out
+++
b/sql/core/src/test/resources/sql-tests/analyzer-results/ceil-floor-with-scale-param.sql.out
@@ -82,7 +82,7 @@ SELECT CEIL(2.5, null)
org.apache.spark.sql.AnalysisException
{
"errorClass" : "NON_FOLDABLE_ARGUMENT",
- "sqlState" : "22024",
+ "sqlState" : "42K08",
"messageParameters" : {
"funcName" : "`ceil`",
"paramName" : "`scale`",
@@ -104,7 +104,7 @@ SELECT CEIL(2.5, 'a')
org.apache.spark.sql.AnalysisException
{
"errorClass" : "NON_FOLDABLE_ARGUMENT",
- "sqlState" : "22024",
+ "sqlState" : "42K08",
"messageParameters" : {
"funcName" : "`ceil`",
"paramName" : "`scale`",
@@ -226,7 +226,7 @@ SELECT FLOOR(2.5, null)
org.apache.spark.sql.AnalysisException
{
"errorClass" : "NON_FOLDABLE_ARGUMENT",
- "sqlState" : "22024",
+ "sqlState" : "42K08",
"messageParameters" : {
"funcName" : "`floor`",
"paramName" : "`scale`",
@@ -248,7 +248,7 @@ SELECT FLOOR(2.5, 'a')
org.apache.spark.sql.AnalysisException
{
"errorClass" : "NON_FOLDABLE_ARGUMENT",
- "sqlState" : "22024",
+ "sqlState" : "42K08",
"messageParameters" : {
"funcName" : "`floor`",
"paramName" : "`scale`",
diff --git
a/sql/core/src/test/resources/sql-tests/analyzer-results/extract.sql.out
b/sql/core/src/test/resources/sql-tests/analyzer-results/extract.sql.out
index eabe92ab12d..b7c5a44ae22 100644
--- a/sql/core/src/test/resources/sql-tests/analyzer-results/extract.sql.out
+++ b/sql/core/src/test/resources/sql-tests/analyzer-results/extract.sql.out
@@ -933,7 +933,7 @@ select date_part(c, c) from t
org.apache.spark.sql.AnalysisException
{
"errorClass" : "NON_FOLDABLE_ARGUMENT",
- "sqlState" : "22024",
+ "sqlState" : "42K08",
"messageParameters" : {
"funcName" : "`date_part`",
"paramName" : "`field`",
@@ -966,7 +966,7 @@ select date_part(i, i) from t
org.apache.spark.sql.AnalysisException
{
"errorClass" : "NON_FOLDABLE_ARGUMENT",
- "sqlState" : "22024",
+ "sqlState" : "42K08",
"messageParameters" : {
"funcName" : "`date_part`",
"paramName" : "`field`",
diff --git
a/sql/core/src/test/resources/sql-tests/analyzer-results/group-analytics.sql.out
b/sql/core/src/test/resources/sql-tests/analyzer-results/group-analytics.sql.out
index 592a19f593c..c46fde114c3 100644
---
a/sql/core/src/test/resources/sql-tests/analyzer-results/group-analytics.sql.out
+++
b/sql/core/src/test/resources/sql-tests/analyzer-results/group-analytics.sql.out
@@ -333,6 +333,7 @@ SELECT course, year, GROUPING(course) FROM courseSales
GROUP BY course, year
org.apache.spark.sql.catalyst.ExtendedAnalysisException
{
"errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E",
"queryContext" : [ {
"objectType" : "",
"objectName" : "",
@@ -349,6 +350,7 @@ SELECT course, year, GROUPING_ID(course, year) FROM
courseSales GROUP BY course,
org.apache.spark.sql.catalyst.ExtendedAnalysisException
{
"errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E",
"queryContext" : [ {
"objectType" : "",
"objectName" : "",
@@ -397,7 +399,8 @@ SELECT course, year FROM courseSales GROUP BY course, year
HAVING GROUPING(cours
-- !query analysis
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION"
+ "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E"
}
@@ -406,7 +409,8 @@ SELECT course, year FROM courseSales GROUP BY course, year
HAVING GROUPING_ID(co
-- !query analysis
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION"
+ "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E"
}
@@ -465,7 +469,8 @@ SELECT course, year FROM courseSales GROUP BY course, year
ORDER BY GROUPING(cou
-- !query analysis
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION"
+ "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E"
}
@@ -474,7 +479,8 @@ SELECT course, year FROM courseSales GROUP BY course, year
ORDER BY GROUPING_ID(
-- !query analysis
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION"
+ "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E"
}
diff --git
a/sql/core/src/test/resources/sql-tests/analyzer-results/named-function-arguments.sql.out
b/sql/core/src/test/resources/sql-tests/analyzer-results/named-function-arguments.sql.out
index 4ba47e9e1b4..735e5a9da40 100644
---
a/sql/core/src/test/resources/sql-tests/analyzer-results/named-function-arguments.sql.out
+++
b/sql/core/src/test/resources/sql-tests/analyzer-results/named-function-arguments.sql.out
@@ -155,7 +155,7 @@ SELECT * FROM explode(collection => explode(array(1)))
org.apache.spark.sql.AnalysisException
{
"errorClass" : "UNSUPPORTED_GENERATOR.NESTED_IN_EXPRESSIONS",
- "sqlState" : "0A000",
+ "sqlState" : "42K0E",
"messageParameters" : {
"expression" : "\"explode(explode(array(1)))\""
},
@@ -175,7 +175,7 @@ SELECT * FROM explode(collection => explode(collection =>
array(1)))
org.apache.spark.sql.AnalysisException
{
"errorClass" : "UNSUPPORTED_GENERATOR.NESTED_IN_EXPRESSIONS",
- "sqlState" : "0A000",
+ "sqlState" : "42K0E",
"messageParameters" : {
"expression" : "\"explode(explode(array(1)))\""
},
diff --git
a/sql/core/src/test/resources/sql-tests/analyzer-results/postgreSQL/window_part3.sql.out
b/sql/core/src/test/resources/sql-tests/analyzer-results/postgreSQL/window_part3.sql.out
index 3bd67cf7e64..f0f111c6857 100644
---
a/sql/core/src/test/resources/sql-tests/analyzer-results/postgreSQL/window_part3.sql.out
+++
b/sql/core/src/test/resources/sql-tests/analyzer-results/postgreSQL/window_part3.sql.out
@@ -319,6 +319,7 @@ SELECT * FROM empsalary INNER JOIN tenk1 ON row_number()
OVER (ORDER BY salary)
org.apache.spark.sql.catalyst.ExtendedAnalysisException
{
"errorClass" : "UNSUPPORTED_EXPR_FOR_OPERATOR",
+ "sqlState" : "42K0E",
"messageParameters" : {
"invalidExprSqls" : "\"row_number() OVER (ORDER BY salary ASC NULLS FIRST
ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW)\""
},
@@ -338,6 +339,7 @@ SELECT rank() OVER (ORDER BY 1), count(*) FROM empsalary
GROUP BY 1
org.apache.spark.sql.catalyst.ExtendedAnalysisException
{
"errorClass" : "UNSUPPORTED_EXPR_FOR_OPERATOR",
+ "sqlState" : "42K0E",
"messageParameters" : {
"invalidExprSqls" : "\"RANK() OVER (ORDER BY 1 ASC NULLS FIRST ROWS
BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW)\""
},
diff --git
a/sql/core/src/test/resources/sql-tests/analyzer-results/table-valued-functions.sql.out
b/sql/core/src/test/resources/sql-tests/analyzer-results/table-valued-functions.sql.out
index 030713ab1c2..fe4c6ec2605 100644
---
a/sql/core/src/test/resources/sql-tests/analyzer-results/table-valued-functions.sql.out
+++
b/sql/core/src/test/resources/sql-tests/analyzer-results/table-valued-functions.sql.out
@@ -5,6 +5,7 @@ select * from dummy(3)
org.apache.spark.sql.AnalysisException
{
"errorClass" : "UNRESOLVABLE_TABLE_VALUED_FUNCTION",
+ "sqlState" : "42883",
"messageParameters" : {
"name" : "`dummy`"
},
@@ -312,7 +313,7 @@ select * from explode(explode(array(1)))
org.apache.spark.sql.AnalysisException
{
"errorClass" : "UNSUPPORTED_GENERATOR.NESTED_IN_EXPRESSIONS",
- "sqlState" : "0A000",
+ "sqlState" : "42K0E",
"messageParameters" : {
"expression" : "\"explode(explode(array(1)))\""
},
@@ -593,7 +594,7 @@ select * from posexplode(explode(array(1)))
org.apache.spark.sql.AnalysisException
{
"errorClass" : "UNSUPPORTED_GENERATOR.NESTED_IN_EXPRESSIONS",
- "sqlState" : "0A000",
+ "sqlState" : "42K0E",
"messageParameters" : {
"expression" : "\"posexplode(explode(array(1)))\""
},
@@ -924,7 +925,7 @@ select * from stack(2, explode(array(1, 2, 3)))
org.apache.spark.sql.AnalysisException
{
"errorClass" : "UNSUPPORTED_GENERATOR.NESTED_IN_EXPRESSIONS",
- "sqlState" : "0A000",
+ "sqlState" : "42K0E",
"messageParameters" : {
"expression" : "\"stack(2, explode(array(1, 2, 3)))\""
},
diff --git
a/sql/core/src/test/resources/sql-tests/analyzer-results/udf/udf-group-analytics.sql.out
b/sql/core/src/test/resources/sql-tests/analyzer-results/udf/udf-group-analytics.sql.out
index ae4dbeca9b0..4ac5f287a64 100644
---
a/sql/core/src/test/resources/sql-tests/analyzer-results/udf/udf-group-analytics.sql.out
+++
b/sql/core/src/test/resources/sql-tests/analyzer-results/udf/udf-group-analytics.sql.out
@@ -206,6 +206,7 @@ SELECT course, udf(year), GROUPING(course) FROM courseSales
GROUP BY course, udf
org.apache.spark.sql.catalyst.ExtendedAnalysisException
{
"errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E",
"queryContext" : [ {
"objectType" : "",
"objectName" : "",
@@ -222,6 +223,7 @@ SELECT course, udf(year), GROUPING_ID(course, year) FROM
courseSales GROUP BY ud
org.apache.spark.sql.catalyst.ExtendedAnalysisException
{
"errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E",
"queryContext" : [ {
"objectType" : "",
"objectName" : "",
@@ -270,7 +272,8 @@ SELECT course, udf(year) FROM courseSales GROUP BY
udf(course), year HAVING GROU
-- !query analysis
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION"
+ "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E"
}
@@ -279,7 +282,8 @@ SELECT course, udf(udf(year)) FROM courseSales GROUP BY
course, year HAVING GROU
-- !query analysis
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION"
+ "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E"
}
@@ -338,7 +342,8 @@ SELECT course, udf(year) FROM courseSales GROUP BY course,
udf(year) ORDER BY GR
-- !query analysis
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION"
+ "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E"
}
@@ -347,7 +352,8 @@ SELECT course, udf(year) FROM courseSales GROUP BY course,
udf(year) ORDER BY GR
-- !query analysis
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION"
+ "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E"
}
diff --git
a/sql/core/src/test/resources/sql-tests/results/ceil-floor-with-scale-param.sql.out
b/sql/core/src/test/resources/sql-tests/results/ceil-floor-with-scale-param.sql.out
index b15682b0a51..86f54665ad0 100644
---
a/sql/core/src/test/resources/sql-tests/results/ceil-floor-with-scale-param.sql.out
+++
b/sql/core/src/test/resources/sql-tests/results/ceil-floor-with-scale-param.sql.out
@@ -95,7 +95,7 @@ struct<>
org.apache.spark.sql.AnalysisException
{
"errorClass" : "NON_FOLDABLE_ARGUMENT",
- "sqlState" : "22024",
+ "sqlState" : "42K08",
"messageParameters" : {
"funcName" : "`ceil`",
"paramName" : "`scale`",
@@ -119,7 +119,7 @@ struct<>
org.apache.spark.sql.AnalysisException
{
"errorClass" : "NON_FOLDABLE_ARGUMENT",
- "sqlState" : "22024",
+ "sqlState" : "42K08",
"messageParameters" : {
"funcName" : "`ceil`",
"paramName" : "`scale`",
@@ -256,7 +256,7 @@ struct<>
org.apache.spark.sql.AnalysisException
{
"errorClass" : "NON_FOLDABLE_ARGUMENT",
- "sqlState" : "22024",
+ "sqlState" : "42K08",
"messageParameters" : {
"funcName" : "`floor`",
"paramName" : "`scale`",
@@ -280,7 +280,7 @@ struct<>
org.apache.spark.sql.AnalysisException
{
"errorClass" : "NON_FOLDABLE_ARGUMENT",
- "sqlState" : "22024",
+ "sqlState" : "42K08",
"messageParameters" : {
"funcName" : "`floor`",
"paramName" : "`scale`",
diff --git a/sql/core/src/test/resources/sql-tests/results/extract.sql.out
b/sql/core/src/test/resources/sql-tests/results/extract.sql.out
index 8416327ef31..02a7315be7a 100644
--- a/sql/core/src/test/resources/sql-tests/results/extract.sql.out
+++ b/sql/core/src/test/resources/sql-tests/results/extract.sql.out
@@ -715,7 +715,7 @@ struct<>
org.apache.spark.sql.AnalysisException
{
"errorClass" : "NON_FOLDABLE_ARGUMENT",
- "sqlState" : "22024",
+ "sqlState" : "42K08",
"messageParameters" : {
"funcName" : "`date_part`",
"paramName" : "`field`",
@@ -747,7 +747,7 @@ struct<>
org.apache.spark.sql.AnalysisException
{
"errorClass" : "NON_FOLDABLE_ARGUMENT",
- "sqlState" : "22024",
+ "sqlState" : "42K08",
"messageParameters" : {
"funcName" : "`date_part`",
"paramName" : "`field`",
diff --git
a/sql/core/src/test/resources/sql-tests/results/group-analytics.sql.out
b/sql/core/src/test/resources/sql-tests/results/group-analytics.sql.out
index 4286034b928..f7f76242a4e 100644
--- a/sql/core/src/test/resources/sql-tests/results/group-analytics.sql.out
+++ b/sql/core/src/test/resources/sql-tests/results/group-analytics.sql.out
@@ -467,6 +467,7 @@ struct<>
org.apache.spark.sql.catalyst.ExtendedAnalysisException
{
"errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E",
"queryContext" : [ {
"objectType" : "",
"objectName" : "",
@@ -485,6 +486,7 @@ struct<>
org.apache.spark.sql.catalyst.ExtendedAnalysisException
{
"errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E",
"queryContext" : [ {
"objectType" : "",
"objectName" : "",
@@ -529,7 +531,8 @@ struct<>
-- !query output
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION"
+ "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E"
}
@@ -540,7 +543,8 @@ struct<>
-- !query output
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION"
+ "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E"
}
@@ -597,7 +601,8 @@ struct<>
-- !query output
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION"
+ "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E"
}
@@ -608,7 +613,8 @@ struct<>
-- !query output
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION"
+ "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E"
}
diff --git
a/sql/core/src/test/resources/sql-tests/results/named-function-arguments.sql.out
b/sql/core/src/test/resources/sql-tests/results/named-function-arguments.sql.out
index 03963ac3ef9..64a6c4b2372 100644
---
a/sql/core/src/test/resources/sql-tests/results/named-function-arguments.sql.out
+++
b/sql/core/src/test/resources/sql-tests/results/named-function-arguments.sql.out
@@ -134,7 +134,7 @@ struct<>
org.apache.spark.sql.AnalysisException
{
"errorClass" : "UNSUPPORTED_GENERATOR.NESTED_IN_EXPRESSIONS",
- "sqlState" : "0A000",
+ "sqlState" : "42K0E",
"messageParameters" : {
"expression" : "\"explode(explode(array(1)))\""
},
@@ -156,7 +156,7 @@ struct<>
org.apache.spark.sql.AnalysisException
{
"errorClass" : "UNSUPPORTED_GENERATOR.NESTED_IN_EXPRESSIONS",
- "sqlState" : "0A000",
+ "sqlState" : "42K0E",
"messageParameters" : {
"expression" : "\"explode(explode(array(1)))\""
},
diff --git
a/sql/core/src/test/resources/sql-tests/results/postgreSQL/window_part3.sql.out
b/sql/core/src/test/resources/sql-tests/results/postgreSQL/window_part3.sql.out
index 8ae38c51d3c..6cfb2cb4b45 100644
---
a/sql/core/src/test/resources/sql-tests/results/postgreSQL/window_part3.sql.out
+++
b/sql/core/src/test/resources/sql-tests/results/postgreSQL/window_part3.sql.out
@@ -334,6 +334,7 @@ struct<>
org.apache.spark.sql.catalyst.ExtendedAnalysisException
{
"errorClass" : "UNSUPPORTED_EXPR_FOR_OPERATOR",
+ "sqlState" : "42K0E",
"messageParameters" : {
"invalidExprSqls" : "\"row_number() OVER (ORDER BY salary ASC NULLS FIRST
ROWS BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW)\""
},
@@ -355,6 +356,7 @@ struct<>
org.apache.spark.sql.catalyst.ExtendedAnalysisException
{
"errorClass" : "UNSUPPORTED_EXPR_FOR_OPERATOR",
+ "sqlState" : "42K0E",
"messageParameters" : {
"invalidExprSqls" : "\"RANK() OVER (ORDER BY 1 ASC NULLS FIRST ROWS
BETWEEN UNBOUNDED PRECEDING AND CURRENT ROW)\""
},
diff --git
a/sql/core/src/test/resources/sql-tests/results/table-valued-functions.sql.out
b/sql/core/src/test/resources/sql-tests/results/table-valued-functions.sql.out
index a65a3f72a29..a7e9ecd2543 100644
---
a/sql/core/src/test/resources/sql-tests/results/table-valued-functions.sql.out
+++
b/sql/core/src/test/resources/sql-tests/results/table-valued-functions.sql.out
@@ -7,6 +7,7 @@ struct<>
org.apache.spark.sql.AnalysisException
{
"errorClass" : "UNRESOLVABLE_TABLE_VALUED_FUNCTION",
+ "sqlState" : "42883",
"messageParameters" : {
"name" : "`dummy`"
},
@@ -361,7 +362,7 @@ struct<>
org.apache.spark.sql.AnalysisException
{
"errorClass" : "UNSUPPORTED_GENERATOR.NESTED_IN_EXPRESSIONS",
- "sqlState" : "0A000",
+ "sqlState" : "42K0E",
"messageParameters" : {
"expression" : "\"explode(explode(array(1)))\""
},
@@ -657,7 +658,7 @@ struct<>
org.apache.spark.sql.AnalysisException
{
"errorClass" : "UNSUPPORTED_GENERATOR.NESTED_IN_EXPRESSIONS",
- "sqlState" : "0A000",
+ "sqlState" : "42K0E",
"messageParameters" : {
"expression" : "\"posexplode(explode(array(1)))\""
},
@@ -1008,7 +1009,7 @@ struct<>
org.apache.spark.sql.AnalysisException
{
"errorClass" : "UNSUPPORTED_GENERATOR.NESTED_IN_EXPRESSIONS",
- "sqlState" : "0A000",
+ "sqlState" : "42K0E",
"messageParameters" : {
"expression" : "\"stack(2, explode(array(1, 2, 3)))\""
},
diff --git
a/sql/core/src/test/resources/sql-tests/results/udf/udf-group-analytics.sql.out
b/sql/core/src/test/resources/sql-tests/results/udf/udf-group-analytics.sql.out
index ffa25eecdce..9beee9972ab 100644
---
a/sql/core/src/test/resources/sql-tests/results/udf/udf-group-analytics.sql.out
+++
b/sql/core/src/test/resources/sql-tests/results/udf/udf-group-analytics.sql.out
@@ -209,6 +209,7 @@ struct<>
org.apache.spark.sql.catalyst.ExtendedAnalysisException
{
"errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E",
"queryContext" : [ {
"objectType" : "",
"objectName" : "",
@@ -227,6 +228,7 @@ struct<>
org.apache.spark.sql.catalyst.ExtendedAnalysisException
{
"errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E",
"queryContext" : [ {
"objectType" : "",
"objectName" : "",
@@ -271,7 +273,8 @@ struct<>
-- !query output
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION"
+ "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E"
}
@@ -282,7 +285,8 @@ struct<>
-- !query output
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION"
+ "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E"
}
@@ -339,7 +343,8 @@ struct<>
-- !query output
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION"
+ "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E"
}
@@ -350,7 +355,8 @@ struct<>
-- !query output
org.apache.spark.sql.AnalysisException
{
- "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION"
+ "errorClass" : "UNSUPPORTED_GROUPING_EXPRESSION",
+ "sqlState" : "42K0E"
}
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]