This is an automated email from the ASF dual-hosted git repository.

dongjoon pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/spark-connect-swift.git


The following commit(s) were added to refs/heads/main by this push:
     new 07a276e  [SPARK-53724] Update `Examples` and documentation to use `4.0.1`
07a276e is described below

commit 07a276e0b77135894509d5637d8e5447345afe26
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Thu Sep 25 18:44:05 2025 -0700

    [SPARK-53724] Update `Examples` and documentation to use `4.0.1`
    
    ### What changes were proposed in this pull request?
    
    This PR aims to update the `Examples` and documentation to use `4.0.1`.
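    
    Concretely, each occurrence is a version-string bump in the documented commands and links, for example (an illustrative excerpt of the pattern; the full diff is below):
    
    ```bash
    # before: documented server command pinned to 4.0.0
    docker run --rm -p 15002:15002 apache/spark:4.0.0 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
    # after: the same command pinned to 4.0.1
    docker run --rm -p 15002:15002 apache/spark:4.0.1 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
    ```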
    
    ### Why are the changes needed?
    
    The Apache Spark community highly recommends `4.0.1` for all Spark 4.0 users.
    
    ### Does this PR introduce _any_ user-facing change?
    
    No behavior change.
    
    ### How was this patch tested?
    
    Pass the CIs and manual review.
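    
    A simple spot check (hypothetical commands, not part of this commit) is to confirm that no stale `4.0.0` strings remain in the touched documentation:
    
    ```bash
    # hypothetical verification: list any remaining 4.0.0 references in the updated docs
    grep -rn "4\.0\.0" README.md Examples Sources/SparkConnect/Documentation.docc
    ```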
    
    ### Was this patch authored or co-authored using generative AI tooling?
    
    No.
    
    Closes #245 from dongjoon-hyun/SPARK-53724.
    
    Authored-by: Dongjoon Hyun <[email protected]>
    Signed-off-by: Dongjoon Hyun <[email protected]>
---
 Examples/app/README.md                              |  6 +++---
 Examples/pi/README.md                               |  2 +-
 Examples/spark-sql/README.md                        | 10 +++++-----
 Examples/stream/README.md                           |  4 ++--
 Examples/web/README.md                              |  4 ++--
 README.md                                           |  4 ++--
 Sources/SparkConnect/Documentation.docc/Examples.md |  6 +++---
 7 files changed, 18 insertions(+), 18 deletions(-)

diff --git a/Examples/app/README.md b/Examples/app/README.md
index 1a4b82f..754c6f6 100644
--- a/Examples/app/README.md
+++ b/Examples/app/README.md
@@ -7,7 +7,7 @@ This is an example Swift application to show how to use Apache Spark Connect Swi
 Prepare `Spark Connect Server` via running Docker image.
 
 ```bash
-docker run --rm -p 15002:15002 apache/spark:4.0.0 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
+docker run --rm -p 15002:15002 apache/spark:4.0.1 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
 ```
 
 Build an application Docker image.
@@ -23,7 +23,7 @@ Run `app` docker image.
 
 ```bash
 $ docker run --rm -e SPARK_REMOTE=sc://host.docker.internal:15002 apache/spark-connect-swift:app
-Connected to Apache Spark 4.0.0 Server
+Connected to Apache Spark 4.0.1 Server
 EXECUTE: DROP TABLE IF EXISTS t
 EXECUTE: CREATE TABLE IF NOT EXISTS t(a INT) USING ORC
 EXECUTE: INSERT INTO t VALUES (1), (2), (3)
@@ -52,7 +52,7 @@ Run from source code.
 ```bash
 $ swift run
 ...
-Connected to Apache Spark 4.0.0 Server
+Connected to Apache Spark 4.0.1 Server
 EXECUTE: DROP TABLE IF EXISTS t
 EXECUTE: CREATE TABLE IF NOT EXISTS t(a INT) USING ORC
 EXECUTE: INSERT INTO t VALUES (1), (2), (3)
diff --git a/Examples/pi/README.md b/Examples/pi/README.md
index d1e203e..3502911 100644
--- a/Examples/pi/README.md
+++ b/Examples/pi/README.md
@@ -7,7 +7,7 @@ This is an example Swift application to show how to use Apache Spark Connect Swi
 Prepare `Spark Connect Server` via running Docker image.
 
 ```bash
-docker run --rm -p 15002:15002 apache/spark:4.0.0 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
+docker run --rm -p 15002:15002 apache/spark:4.0.1 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
 ```
 
 Build an application Docker image.
diff --git a/Examples/spark-sql/README.md b/Examples/spark-sql/README.md
index 015079d..2be2846 100644
--- a/Examples/spark-sql/README.md
+++ b/Examples/spark-sql/README.md
@@ -7,7 +7,7 @@ This is an example Swift application to show how to develop a Spark SQL REPL(Rea
 Prepare `Spark Connect Server` via running Docker image.
 
 ```bash
-docker run -it --rm -p 15002:15002 apache/spark:4.0.0 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
+docker run -it --rm -p 15002:15002 apache/spark:4.0.1 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
 ```
 
 Build an application Docker image.
@@ -23,7 +23,7 @@ Run `spark-sql` docker image.
 
 ```bash
 $ docker run -it --rm -e SPARK_REMOTE=sc://host.docker.internal:15002 apache/spark-connect-swift:spark-sql
-Connected to Apache Spark 4.0.0 Server
+Connected to Apache Spark 4.0.1 Server
 spark-sql (default)> SHOW DATABASES;
 +---------+
 |namespace|
@@ -85,13 +85,13 @@ spark-sql (default)> DROP DATABASE db1 CASCADE;
 spark-sql (default)> exit;
 ```
 
-Apache Spark 4 supports [SQL Pipe Syntax](https://spark.apache.org/docs/4.0.0/sql-pipe-syntax.html).
+Apache Spark 4 supports [SQL Pipe Syntax](https://spark.apache.org/docs/4.0.1/sql-pipe-syntax.html).
 
 ```bash
 $ swift run
 ...
 Build of product 'SparkSQLRepl' complete! (2.33s)
-Connected to Apache Spark 4.0.0 Server
+Connected to Apache Spark 4.0.1 Server
 spark-sql (default)>
 FROM ORC.`/opt/spark/examples/src/main/resources/users.orc`
 |> AGGREGATE COUNT(*) cnt
@@ -113,6 +113,6 @@ Run from source code.
 ```bash
 $ swift run
 ...
-Connected to Apache Spark 4.0.0 Server
+Connected to Apache Spark 4.0.1 Server
 spark-sql (default)>
 ```
diff --git a/Examples/stream/README.md b/Examples/stream/README.md
index f4eb2fc..4c621a4 100644
--- a/Examples/stream/README.md
+++ b/Examples/stream/README.md
@@ -5,7 +5,7 @@ This is an example Swift stream processing application to show how to count word
 ## Run `Spark Connect Server`
 
 ```bash
-docker run --rm -p 15002:15002 apache/spark:4.0.0 bash -c "/opt/spark/sbin/start-connect-server.sh --wait -c spark.log.level=ERROR"
+docker run --rm -p 15002:15002 apache/spark:4.0.1 bash -c "/opt/spark/sbin/start-connect-server.sh --wait -c spark.log.level=ERROR"
 ```
 
 ## Run `Netcat` as a streaming input server
@@ -81,5 +81,5 @@ Batch: 2
 ```bash
 $ swift run
 ...
-Connected to Apache Spark 4.0.0 Server
+Connected to Apache Spark 4.0.1 Server
 ```
diff --git a/Examples/web/README.md b/Examples/web/README.md
index 1668c0d..c9a1174 100644
--- a/Examples/web/README.md
+++ b/Examples/web/README.md
@@ -90,7 +90,7 @@ index 2edcc8f..dd918a9 100644
 Prepare `Spark Connect Server` via running Docker image.
 
 ```bash
-docker run --rm -p 15002:15002 apache/spark:4.0.0 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
+docker run --rm -p 15002:15002 apache/spark:4.0.1 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
 ```
 
 Build an application Docker image.
@@ -116,7 +116,7 @@ $ curl http://127.0.0.1:8080/
 Welcome to the Swift world. Say hello!%
 
 $ curl http://127.0.0.1:8080/hello
-Hi, this is powered by the Apache Spark 4.0.0.%
+Hi, this is powered by the Apache Spark 4.0.1.%
 ```
 
 Run from source code.
diff --git a/README.md b/README.md
index 3ebaa5a..cec09f0 100644
--- a/README.md
+++ b/README.md
@@ -16,7 +16,7 @@ So far, this library project is tracking the upstream changes of [Apache Arrow](
 
 ## Requirement
 
-- [Apache Spark 4.0.0 (May 2025)](https://github.com/apache/spark/releases/tag/v4.0.0)
+- [Apache Spark 4.0.1 (September 2025)](https://github.com/apache/spark/releases/tag/v4.0.1)
 - [Swift 6.0/6.1/6.2 (September 2025)](https://swift.org)
 - [gRPC Swift 2.1 (July 2025)](https://github.com/grpc/grpc-swift-2/releases/tag/2.1.0)
 - [gRPC Swift Protobuf 2.1 (August 2025)](https://github.com/grpc/grpc-swift-protobuf/releases/tag/2.1.1)
@@ -91,7 +91,7 @@ Run your Swift application.
 ```bash
 $ swift run
 ...
-Connected to Apache Spark 4.0.0 Server
+Connected to Apache Spark 4.0.1 Server
 EXECUTE: DROP TABLE IF EXISTS t
 EXECUTE: CREATE TABLE IF NOT EXISTS t(a INT)
 EXECUTE: INSERT INTO t VALUES (1), (2), (3)
diff --git a/Sources/SparkConnect/Documentation.docc/Examples.md b/Sources/SparkConnect/Documentation.docc/Examples.md
index 5858d18..7a1a7ac 100644
--- a/Sources/SparkConnect/Documentation.docc/Examples.md
+++ b/Sources/SparkConnect/Documentation.docc/Examples.md
@@ -7,7 +7,7 @@ This document provides an overview of the example applications inside [Examples]
 Start a Spark Connect Server:
 
 ```bash
-docker run -it --rm -p 15002:15002 apache/spark:4.0.0 bash -c "/opt/spark/sbin/start-connect-server.sh --wait -c spark.log.level=ERROR"
+docker run -it --rm -p 15002:15002 apache/spark:4.0.1 bash -c "/opt/spark/sbin/start-connect-server.sh --wait -c spark.log.level=ERROR"
 ```
 
 ## Basic Application Example
@@ -154,7 +154,7 @@ Welcome to the Swift world. Say hello!%
 
 # Spark-powered endpoint
 curl http://127.0.0.1:8080/hello
-Hi, this is powered by the Apache Spark 4.0.0.%
+Hi, this is powered by the Apache Spark 4.0.1.%
 ```
 
 ## Development Environment
@@ -166,4 +166,4 @@ All examples include:
 - A README.md with detailed instructions
 - Source code in the Sources directory
 
-These examples are designed to be used with Apache Spark 4.0.0 or newer, using the Spark Connect protocol for client-server interaction.
+These examples are designed to be used with Apache Spark 4.0 or newer, using the Spark Connect protocol for client-server interaction.

