This is an automated email from the ASF dual-hosted git repository.

gurwls223 pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 3aa757000173 [SPARK-51898][PYTHON][CONNECT][TESTS][FOLLOW-UP] Fix inconsistent exception
3aa757000173 is described below

commit 3aa75700017322d6562a8f6822687ed6dea9ff27
Author: Ruifeng Zheng <ruife...@apache.org>
AuthorDate: Sun Apr 27 16:04:42 2025 +0900

    [SPARK-51898][PYTHON][CONNECT][TESTS][FOLLOW-UP] Fix inconsistent exception
    
    ### What changes were proposed in this pull request?
    
    https://github.com/apache/spark/pull/50693 enabled `SparkConnectErrorTests` in connect-only mode.
    
    `toJSON` and `rdd` throw `PySparkNotImplementedError` in connect mode, but `PySparkAttributeError` in connect-only mode.
    
    ### Why are the changes needed?
    to fix https://github.com/apache/spark/actions/runs/14649632571/job/41112060443
    
    ### Does this PR introduce _any_ user-facing change?
    no
    
    ### How was this patch tested?
    will be tested in the daily builder
    
    ### Was this patch authored or co-authored using generative AI tooling?
    no
    
    Closes #50708 from zhengruifeng/follow_up_connect_error.
    
    Authored-by: Ruifeng Zheng <ruife...@apache.org>
    Signed-off-by: Hyukjin Kwon <gurwls...@apache.org>
---
 python/pyspark/sql/tests/connect/test_connect_error.py | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/python/pyspark/sql/tests/connect/test_connect_error.py b/python/pyspark/sql/tests/connect/test_connect_error.py
index 3f540f83dcb7..47963e89471c 100644
--- a/python/pyspark/sql/tests/connect/test_connect_error.py
+++ b/python/pyspark/sql/tests/connect/test_connect_error.py
@@ -23,6 +23,7 @@ from pyspark.sql.types import Row
 from pyspark.sql import functions as F
 from pyspark.errors import PySparkTypeError
 from pyspark.testing.connectutils import ReusedConnectTestCase
+from pyspark.util import is_remote_only
 
 
 class SparkConnectErrorTests(ReusedConnectTestCase):
@@ -165,6 +166,7 @@ class SparkConnectErrorTests(ReusedConnectTestCase):
             messageParameters={},
         )
 
+    @unittest.skipIf(is_remote_only(), "Disabled for remote only")
     def test_unsupported_functions(self):
         # SPARK-41225: Disable unsupported functions.
         df = self.spark.range(10)

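For context, here is a minimal sketch (not the committed test; the class and test names are illustrative) of the skip pattern the diff adds. `is_remote_only()` from `pyspark.util` reports whether the environment is a Spark Connect-only (remote-only) install, and `unittest.skipIf` disables the test there, since in that mode `toJSON` and `rdd` raise `PySparkAttributeError` rather than the `PySparkNotImplementedError` the test expects.

import unittest

from pyspark.errors import PySparkNotImplementedError
from pyspark.testing.connectutils import ReusedConnectTestCase
from pyspark.util import is_remote_only


# Illustrative names; the real test lives in
# python/pyspark/sql/tests/connect/test_connect_error.py.
class UnsupportedFunctionsSkipExample(ReusedConnectTestCase):
    @unittest.skipIf(is_remote_only(), "Disabled for remote only")
    def test_unsupported_functions(self):
        df = self.spark.range(10)

        # In regular Spark Connect mode these unsupported APIs raise
        # PySparkNotImplementedError; in connect-only (remote-only) builds
        # they raise PySparkAttributeError instead, hence the skip above.
        with self.assertRaises(PySparkNotImplementedError):
            df.toJSON()
        with self.assertRaises(PySparkNotImplementedError):
            df.rdd


if __name__ == "__main__":
    unittest.main()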

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
