This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark.git


The following commit(s) were added to refs/heads/master by this push:
     new 3b10407fbf7e [SPARK-54066][PYTHON][TESTS] Skip `test_in_memory_data_source` in Python 3.14
3b10407fbf7e is described below

commit 3b10407fbf7e189ab2c9b490ffecec1f45752a02
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Tue Oct 28 20:14:23 2025 -0700

    [SPARK-54066][PYTHON][TESTS] Skip `test_in_memory_data_source` in Python 3.14
    
    ### What changes were proposed in this pull request?
    
    This PR aims to skip `test_in_memory_data_source` in Python 3.14 for now, in order to reveal other failures in the CI.
    
    ### Why are the changes needed?
    
    The test case fails in both `classic` and `connect` modes, and SPARK-54065 was filed to fix it after further investigation in both modes.
    - SPARK-54065 Fix `test_in_memory_data_source` in Python 3.14
    
    ```
    ======================================================================
    ERROR [0.007s]: test_in_memory_data_source (pyspark.sql.tests.test_python_datasource.PythonDataSourceTests.test_in_memory_data_source)
    ```
    
    ```
    ======================================================================
    ERROR [0.014s]: test_in_memory_data_source (pyspark.sql.tests.connect.test_parity_python_datasource.PythonDataSourceParityTests.test_in_memory_data_source)
    ```
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Pass the CIs. Manually tested on Python 3.14.
    
    ```
    $ python/run-tests --parallelism 1 --testnames pyspark.sql.tests.test_python_datasource --python-executables python3
    Running PySpark tests. Output is in /Users/dongjoon/APACHE/spark-merge/python/unit-tests.log
    Will test against the following Python executables: ['python3']
    Will test the following Python tests: ['pyspark.sql.tests.test_python_datasource']
    python3 python_implementation is CPython
    python3 version is: Python 3.14.0
    Starting test(python3): pyspark.sql.tests.test_python_datasource (temp output: /Users/dongjoon/APACHE/spark-merge/python/target/0b6ce5cf-6fca-4146-9fdc-739521349dce/python3__pyspark.sql.tests.test_python_datasource__0iy64w4h.log)
    Finished test(python3): pyspark.sql.tests.test_python_datasource (21s) ... 1 tests were skipped
    Tests passed in 21 seconds
    
    Skipped tests in pyspark.sql.tests.test_python_datasource with python3:
          test_in_memory_data_source (pyspark.sql.tests.test_python_datasource.PythonDataSourceTests.test_in_memory_data_source) ... skip (0.000s)
    ```
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #52769 from dongjoon-hyun/SPARK-54066.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 python/pyspark/sql/tests/test_python_datasource.py | 2 ++
 1 file changed, 2 insertions(+)

diff --git a/python/pyspark/sql/tests/test_python_datasource.py b/python/pyspark/sql/tests/test_python_datasource.py
index 04d9d22dd99e..28f8bf3c832b 100644
--- a/python/pyspark/sql/tests/test_python_datasource.py
+++ b/python/pyspark/sql/tests/test_python_datasource.py
@@ -16,6 +16,7 @@
 #
 import os
 import platform
+import sys
 import tempfile
 import unittest
 from datetime import datetime
@@ -252,6 +253,7 @@ class BasePythonDataSourceTestsMixin:
         with self.assertRaisesRegex(PythonException, "DATA_SOURCE_INVALID_RETURN_TYPE"):
             df.collect()
 
+    @unittest.skipIf(sys.version_info > (3, 13), "SPARK-54065")
     def test_in_memory_data_source(self):
         class InMemDataSourceReader(DataSourceReader):
             DEFAULT_NUM_PARTITIONS: int = 3

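For context on the guard added in the patch: `unittest.skipIf` marks a test as skipped when its condition is true at class-definition time, and `sys.version_info` is a tuple, so `sys.version_info > (3, 13)` is an element-wise comparison that is true on Python 3.14. A minimal, self-contained sketch of the same mechanism (the test class and test name here are illustrative, not from the patch):

```python
import sys
import unittest


class VersionGuardExample(unittest.TestCase):
    # Same condition as the patch: sys.version_info compares element-wise
    # as a tuple, and (3, 14, 0) > (3, 13) is True, so this test is
    # skipped on Python 3.14 with the tracking JIRA id as the reason.
    @unittest.skipIf(sys.version_info > (3, 13), "SPARK-54065")
    def test_guarded(self):
        self.assertTrue(True)


# Run the suite programmatically and inspect whether the test was skipped.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(VersionGuardExample)
result = unittest.TestResult()
suite.run(result)
print("skipped:", len(result.skipped) == 1)
```

A skipped test still counts toward a successful run, which is why the CI log above reports "Tests passed" alongside "1 tests were skipped".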

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
