This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/spark-connect-swift.git


The following commit(s) were added to refs/heads/main by this push:
     new b985c47  [SPARK-53683] Use `Spark 3.5.7` for Spark 3 integration tests
b985c47 is described below

commit b985c4774094451803e26081748475ce3673e1aa
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Tue Sep 23 18:14:45 2025 -0700

    [SPARK-53683] Use `Spark 3.5.7` for Spark 3 integration tests
    
    ### What changes were proposed in this pull request?
    
    This PR aims to use `Spark 3.5.7` for Spark 3 integration tests.
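    
    For reference, the same version string appears in several workflow steps. A minimal sketch of one way to keep them in sync (the `SPARK_VERSION` env variable is hypothetical and not part of this PR):
    
    ```yaml
    # Hypothetical refactor, not in this PR: define the Spark version once
    # at the job level so a future bump touches a single line.
    env:
      SPARK_VERSION: 3.5.7
    steps:
      - name: Test
        run: |
          curl -LO "https://www.apache.org/dyn/closer.lua/spark/spark-${SPARK_VERSION}/spark-${SPARK_VERSION}-bin-hadoop3.tgz?action=download"
          tar xvfz spark-${SPARK_VERSION}-bin-hadoop3.tgz && rm spark-${SPARK_VERSION}-bin-hadoop3.tgz
          mv spark-${SPARK_VERSION}-bin-hadoop3 /tmp/spark
    ```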
    
    ### Why are the changes needed?
    
    Since Apache Spark 3.5.7 is available, we should use this stable version instead of 3.5.6.
    - https://github.com/apache/spark/releases/tag/v3.5.7
    
    ### Does this PR introduce _any_ user-facing change?
    
    No.
    
    ### How was this patch tested?
    
    Pass the CIs.
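    
    To reproduce locally, one can run essentially the same steps as the workflow's `Test` step (a sketch assuming a Unix-like shell with `curl` and a Swift toolchain installed; paths follow the workflow):
    
    ```bash
    # Download and unpack Spark 3.5.7, mirroring the CI steps in the diff below.
    curl -LO "https://www.apache.org/dyn/closer.lua/spark/spark-3.5.7/spark-3.5.7-bin-hadoop3.tgz?action=download"
    tar xvfz spark-3.5.7-bin-hadoop3.tgz && rm spark-3.5.7-bin-hadoop3.tgz
    mv spark-3.5.7-bin-hadoop3 /tmp/spark
    # Start the Spark Connect server, then run the Swift integration tests.
    cd /tmp/spark/sbin
    ./start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:3.5.7
    cd -
    swift test --no-parallel -c release
    ```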
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #240 from dongjoon-hyun/SPARK-53683.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 .github/workflows/build_and_test.yml | 16 ++++++++--------
 1 file changed, 8 insertions(+), 8 deletions(-)

diff --git a/.github/workflows/build_and_test.yml b/.github/workflows/build_and_test.yml
index 9b2fbf5..7c35d83 100644
--- a/.github/workflows/build_and_test.yml
+++ b/.github/workflows/build_and_test.yml
@@ -217,11 +217,11 @@ jobs:
       run: swift test --filter NOTHING -c release
     - name: Test
       run: |
-        curl -LO https://www.apache.org/dyn/closer.lua/spark/spark-3.5.6/spark-3.5.6-bin-hadoop3.tgz?action=download
-        tar xvfz spark-3.5.6-bin-hadoop3.tgz && rm spark-3.5.6-bin-hadoop3.tgz
-        mv spark-3.5.6-bin-hadoop3 /tmp/spark
+        curl -LO https://www.apache.org/dyn/closer.lua/spark/spark-3.5.7/spark-3.5.7-bin-hadoop3.tgz?action=download
+        tar xvfz spark-3.5.7-bin-hadoop3.tgz && rm spark-3.5.7-bin-hadoop3.tgz
+        mv spark-3.5.7-bin-hadoop3 /tmp/spark
         cd /tmp/spark/sbin
-        ./start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:3.5.6
+        ./start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:3.5.7
         cd -
         swift test --no-parallel -c release
 
@@ -245,11 +245,11 @@ jobs:
       run: swift test --filter NOTHING -c release
     - name: Test
       run: |
-        curl -LO https://www.apache.org/dyn/closer.lua/spark/spark-3.5.6/spark-3.5.6-bin-hadoop3.tgz?action=download
-        tar xvfz spark-3.5.6-bin-hadoop3.tgz && rm spark-3.5.6-bin-hadoop3.tgz
-        mv spark-3.5.6-bin-hadoop3 /tmp/spark
+        curl -LO https://www.apache.org/dyn/closer.lua/spark/spark-3.5.7/spark-3.5.7-bin-hadoop3.tgz?action=download
+        tar xvfz spark-3.5.7-bin-hadoop3.tgz && rm spark-3.5.7-bin-hadoop3.tgz
+        mv spark-3.5.7-bin-hadoop3 /tmp/spark
         cd /tmp/spark/sbin
-        ./start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:3.5.6,org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.10.0 -c spark.sql.catalog.local=org.apache.iceberg.spark.SparkCatalog -c spark.sql.catalog.local.type=hadoop -c spark.sql.catalog.local.warehouse=/tmp/spark/warehouse -c spark.sql.defaultCatalog=local
+        ./start-connect-server.sh --packages org.apache.spark:spark-connect_2.12:3.5.7,org.apache.iceberg:iceberg-spark-runtime-3.5_2.12:1.10.0 -c spark.sql.catalog.local=org.apache.iceberg.spark.SparkCatalog -c spark.sql.catalog.local.type=hadoop -c spark.sql.catalog.local.warehouse=/tmp/spark/warehouse -c spark.sql.defaultCatalog=local
         cd -
         swift test --filter DataFrameWriterV2Tests -c release
         swift test --filter IcebergTest -c release


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
