This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch main
in repository https://gitbox.apache.org/repos/asf/spark-connect-swift.git
The following commit(s) were added to refs/heads/main by this push:
new e45ce47 [SPARK-54773] Make `README.md` and `Examples` up-to-date with 4.1.0
e45ce47 is described below
commit e45ce47dda2a67995686bc79aa6f04e783bc8652
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Fri Dec 19 12:39:06 2025 -0800
[SPARK-54773] Make `README.md` and `Examples` up-to-date with 4.1.0
### What changes were proposed in this pull request?
This PR aims to make `README.md` and `Examples` up-to-date with Apache
Spark 4.1.0.
### Why are the changes needed?
To guide users to use the latest Apache Spark release.
### Does this PR introduce _any_ user-facing change?
No behavior change. This is a documentation-only change.
### How was this patch tested?
Manual review.
### Was this patch authored or co-authored using generative AI tooling?
No.
Closes #267 from dongjoon-hyun/SPARK-54773.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
Examples/app/README.md | 45 +++++++--------
Examples/pi/README.md | 10 ++--
Examples/spark-sql/README.md | 65 ++++++++++------------
Examples/stream/README.md | 10 ++--
Examples/web/README.md | 8 +--
README.md | 33 +++++------
.../SparkConnect/Documentation.docc/Examples.md | 4 +-
7 files changed, 86 insertions(+), 89 deletions(-)
diff --git a/Examples/app/README.md b/Examples/app/README.md
index 754c6f6..7c1f874 100644
--- a/Examples/app/README.md
+++ b/Examples/app/README.md
@@ -7,7 +7,7 @@ This is an example Swift application to show how to use Apache Spark Connect Swi
Prepare `Spark Connect Server` via running Docker image.
```bash
-docker run --rm -p 15002:15002 apache/spark:4.0.1 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
+docker run --rm -p 15002:15002 apache/spark:4.1.0 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
```
Build an application Docker image.
@@ -15,15 +15,15 @@ Build an application Docker image.
```bash
$ docker build -t apache/spark-connect-swift:app .
$ docker images apache/spark-connect-swift:app
-REPOSITORY TAG IMAGE ID CREATED SIZE
-apache/spark-connect-swift app e132e1b38348 5 seconds ago 368MB
+IMAGE ID DISK USAGE CONTENT SIZE EXTRA
+apache/spark-connect-swift:app fa24c1e88713 550MB 128MB
```
Run `app` docker image.
```bash
$ docker run --rm -e SPARK_REMOTE=sc://host.docker.internal:15002 apache/spark-connect-swift:app
-Connected to Apache Spark 4.0.1 Server
+Connected to Apache Spark 4.1.0 Server
EXECUTE: DROP TABLE IF EXISTS t
EXECUTE: CREATE TABLE IF NOT EXISTS t(a INT) USING ORC
EXECUTE: INSERT INTO t VALUES (1), (2), (3)
@@ -31,19 +31,19 @@ SELECT * FROM t
+---+
| a|
+---+
-| 2|
-| 1|
| 3|
+| 1|
+| 2|
+---+
+---+
| id|
+---+
| 0|
+| 4|
+| 2|
| 8|
| 6|
-| 2|
-| 4|
+---+
```
@@ -52,25 +52,26 @@ Run from source code.
```bash
$ swift run
...
-Connected to Apache Spark 4.0.1 Server
+Connected to Apache Spark 4.1.0 Server
EXECUTE: DROP TABLE IF EXISTS t
EXECUTE: CREATE TABLE IF NOT EXISTS t(a INT) USING ORC
EXECUTE: INSERT INTO t VALUES (1), (2), (3)
SELECT * FROM t
+---+
-| a |
+| a|
++---+
+| 1|
+| 2|
+| 3|
++---+
+
++---+
+| id|
+---+
-| 2 |
-| 1 |
-| 3 |
+| 8|
+| 4|
+| 0|
+| 6|
+| 2|
+---+
-+----+
-| id |
-+----+
-| 2 |
-| 6 |
-| 0 |
-| 8 |
-| 4 |
-+----+
```
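For readers who want to see roughly what drives the `app` example, here is a minimal Swift sketch that would produce output of the shape shown above. It is a hedged illustration only: the `SparkSession.builder.getOrCreate()`, `sql(_:)`, `show()`, `version`, and `stop()` names are assumptions taken from the project README, not the example's exact source.

```swift
import SparkConnect

// Hedged sketch; API names (SparkSession.builder.getOrCreate, sql, show, version, stop)
// are assumed from the project README, not copied from the example's source.
let spark = try await SparkSession.builder.getOrCreate()
print("Connected to Apache Spark \(await spark.version) Server")

for statement in [
  "DROP TABLE IF EXISTS t",
  "CREATE TABLE IF NOT EXISTS t(a INT) USING ORC",
  "INSERT INTO t VALUES (1), (2), (3)",
] {
  print("EXECUTE: \(statement)")
  _ = try await spark.sql(statement).count()
}

print("SELECT * FROM t")
try await spark.sql("SELECT * FROM t").show()

// One way to reproduce the second table (even ids 0..8); the real example
// likely uses a DataFrame range API instead of plain SQL.
try await spark.sql("SELECT id * 2 AS id FROM range(5)").show()

await spark.stop()
```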
diff --git a/Examples/pi/README.md b/Examples/pi/README.md
index 3502911..873e9ba 100644
--- a/Examples/pi/README.md
+++ b/Examples/pi/README.md
@@ -7,7 +7,7 @@ This is an example Swift application to show how to use Apache Spark Connect Swi
Prepare `Spark Connect Server` via running Docker image.
```bash
-docker run --rm -p 15002:15002 apache/spark:4.0.1 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
+docker run --rm -p 15002:15002 apache/spark:4.1.0 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
```
Build an application Docker image.
@@ -15,15 +15,15 @@ Build an application Docker image.
```bash
$ docker build -t apache/spark-connect-swift:pi .
$ docker images apache/spark-connect-swift:pi
-REPOSITORY TAG IMAGE ID CREATED SIZE
-apache/spark-connect-swift pi d03952577564 4 seconds ago 369MB
+IMAGE ID DISK USAGE CONTENT SIZE EXTRA
+apache/spark-connect-swift:pi cae3fd3b9833 550MB 128MB
```
Run `pi` docker image.
```bash
$ docker run --rm -e SPARK_REMOTE=sc://host.docker.internal:15002 apache/spark-connect-swift:pi
-Pi is roughly 3.1412831412831412
+Pi is roughly 3.1434711434711433
```
Run from source code.
@@ -31,5 +31,5 @@ Run from source code.
```bash
$ swift run
...
-Pi is roughly 3.1423711423711422
+Pi is roughly 3.143995143995144
```
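The new pi values above simply reflect fresh Monte Carlo runs against 4.1.0. As a rough illustration only, such an estimate can be pushed to the server entirely in SQL over Spark Connect; the snippet below is a hedged sketch assuming the `SparkSession`/`sql(_:)` entry points from the project README, and is not the `pi` example's actual implementation.

```swift
import SparkConnect

// Hedged sketch of a Monte Carlo pi estimate pushed down as Spark SQL.
// The SparkSession/sql/show names are assumptions from the project README.
let n = 10_000_000
let spark = try await SparkSession.builder.getOrCreate()

// Sample n random points in the unit square and count those inside the unit circle.
let query = """
  SELECT 4.0 * count(*) / \(n) AS pi
  FROM range(\(n))
  WHERE pow(rand() * 2 - 1, 2) + pow(rand() * 2 - 1, 2) <= 1
  """
try await spark.sql(query).show()
await spark.stop()
```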
diff --git a/Examples/spark-sql/README.md b/Examples/spark-sql/README.md
index 2be2846..c7aaa2e 100644
--- a/Examples/spark-sql/README.md
+++ b/Examples/spark-sql/README.md
@@ -7,7 +7,7 @@ This is an example Swift application to show how to develop a Spark SQL REPL(Rea
Prepare `Spark Connect Server` via running Docker image.
```bash
-docker run -it --rm -p 15002:15002 apache/spark:4.0.1 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
+docker run -it --rm -p 15002:15002 apache/spark:4.1.0 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
```
Build an application Docker image.
@@ -15,15 +15,15 @@ Build an application Docker image.
```bash
$ docker build -t apache/spark-connect-swift:spark-sql .
$ docker images apache/spark-connect-swift:spark-sql
-REPOSITORY TAG IMAGE ID CREATED SIZE
-apache/spark-connect-swift spark-sql 265ddfec650d 7 seconds ago 390MB
+IMAGE ID DISK USAGE CONTENT SIZE EXTRA
+apache/spark-connect-swift:spark-sql f97a7af7b0ff 550MB 128MB
```
Run `spark-sql` docker image.
```bash
$ docker run -it --rm -e SPARK_REMOTE=sc://host.docker.internal:15002 apache/spark-connect-swift:spark-sql
-Connected to Apache Spark 4.0.1 Server
+Connected to Apache Spark 4.1.0 Server
spark-sql (default)> SHOW DATABASES;
+---------+
|namespace|
@@ -31,67 +31,70 @@ spark-sql (default)> SHOW DATABASES;
|default |
+---------+
-Time taken: 30 ms
+Time taken: 276 ms
spark-sql (default)> CREATE DATABASE db1;
++
||
++
++
-Time taken: 31 ms
+Time taken: 63 ms
spark-sql (default)> USE db1;
++
||
++
++
-Time taken: 27 ms
+Time taken: 41 ms
spark-sql (db1)> CREATE TABLE t1 AS SELECT * FROM RANGE(10);
++
||
++
++
-Time taken: 99 ms
+Time taken: 893 ms
spark-sql (db1)> SELECT * FROM t1;
+---+
-| id|
+|id |
+---+
-| 1|
-| 5|
-| 3|
-| 0|
-| 6|
-| 9|
-| 4|
-| 8|
-| 7|
-| 2|
+|3 |
+|8 |
+|5 |
+|9 |
+|1 |
+|6 |
+|7 |
+|0 |
+|4 |
+|2 |
+---+
-Time taken: 80 ms
+Time taken: 267 ms
spark-sql (db1)> USE default;
++
||
++
++
-Time taken: 26 ms
+Time taken: 28 ms
spark-sql (default)> DROP DATABASE db1 CASCADE;
++
||
++
++
+
+Time taken: 46 ms
spark-sql (default)> exit;
```
-Apache Spark 4 supports [SQL Pipe Syntax](https://spark.apache.org/docs/4.0.1/sql-pipe-syntax.html).
+Apache Spark 4 supports [SQL Pipe Syntax](https://spark.apache.org/docs/4.1.0/sql-pipe-syntax.html).
+Run from source code at this time.
```bash
$ swift run
...
Build of product 'SparkSQLRepl' complete! (2.33s)
-Connected to Apache Spark 4.0.1 Server
+Connected to Apache Spark 4.1.0 Server
spark-sql (default)>
FROM ORC.`/opt/spark/examples/src/main/resources/users.orc`
|> AGGREGATE COUNT(*) cnt
@@ -99,20 +102,12 @@ FROM ORC.`/opt/spark/examples/src/main/resources/users.orc`
|> ORDER BY cnt DESC, name ASC
;
+------+---+
-| name|cnt|
+|name |cnt|
+------+---+
-|Alyssa| 1|
-| Ben| 1|
+|Alyssa|1 |
+|Ben |1 |
+------+---+
-Time taken: 159 ms
+Time taken: 550 ms
```
-Run from source code.
-
-```bash
-$ swift run
-...
-Connected to Apache Spark 4.0.1 Server
-spark-sql (default)>
-```
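For reference, the REPL loop that produces the session above can be approximated in a few lines of Swift. This is a hedged sketch: the `SparkSession.builder.getOrCreate()`, `sql(_:)`, `show()`, and `version` names are assumptions from the project README, not `SparkSQLRepl`'s actual source.

```swift
import Foundation
import SparkConnect

// Hedged sketch of a minimal SQL REPL over Spark Connect.
let spark = try await SparkSession.builder.getOrCreate()
print("Connected to Apache Spark \(await spark.version) Server")

while true {
  print("spark-sql> ", terminator: "")
  guard let line = readLine(), line.lowercased() != "exit;" else { break }
  do {
    let start = Date()
    try await spark.sql(line).show()
    let elapsed = Int(Date().timeIntervalSince(start) * 1000)
    print("Time taken: \(elapsed) ms")
  } catch {
    print("Error: \(error)")
  }
}
await spark.stop()
```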
diff --git a/Examples/stream/README.md b/Examples/stream/README.md
index 4c621a4..2c6b87f 100644
--- a/Examples/stream/README.md
+++ b/Examples/stream/README.md
@@ -5,7 +5,7 @@ This is an example Swift stream processing application to show how to count word
## Run `Spark Connect Server`
```bash
-docker run --rm -p 15002:15002 apache/spark:4.0.1 bash -c "/opt/spark/sbin/start-connect-server.sh --wait -c spark.log.level=ERROR"
+docker run --rm -p 15002:15002 apache/spark:4.1.0 bash -c "/opt/spark/sbin/start-connect-server.sh --wait -c spark.log.level=ERROR"
```
## Run `Netcat` as a streaming input server
@@ -23,8 +23,8 @@ Build an application Docker image.
```bash
$ docker build -t apache/spark-connect-swift:stream .
$ docker images apache/spark-connect-swift:stream
-REPOSITORY TAG IMAGE ID CREATED SIZE
-apache/spark-connect-swift stream a4daa10ad9c5 7 seconds ago 369MB
+IMAGE ID DISK USAGE CONTENT SIZE EXTRA
+apache/spark-connect-swift:stream 683d4bd67cec 550MB 128MB
```
Run `stream` docker image.
@@ -79,7 +79,7 @@ Batch: 2
## Run from source code
```bash
-$ swift run
+$ TARGET_HOST=host.docker.internal swift run
...
-Connected to Apache Spark 4.0.1 Server
+Connected to Apache Spark 4.1.0 Server
```
diff --git a/Examples/web/README.md b/Examples/web/README.md
index c9a1174..77f968b 100644
--- a/Examples/web/README.md
+++ b/Examples/web/README.md
@@ -90,7 +90,7 @@ index 2edcc8f..dd918a9 100644
Prepare `Spark Connect Server` via running Docker image.
```bash
-docker run --rm -p 15002:15002 apache/spark:4.0.1 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
+docker run --rm -p 15002:15002 apache/spark:4.1.0 bash -c "/opt/spark/sbin/start-connect-server.sh --wait"
```
Build an application Docker image.
@@ -98,8 +98,8 @@ Build an application Docker image.
```bash
$ docker build -t apache/spark-connect-swift:web .
$ docker images apache/spark-connect-swift:web
-REPOSITORY TAG IMAGE ID CREATED SIZE
-apache/spark-connect-swift web 3fd2422fdbee 27 seconds ago 417MB
+IMAGE ID DISK USAGE CONTENT SIZE EXTRA
+apache/spark-connect-swift:web a3865bf1867e 600MB 139MB
```
Run `web` docker image
@@ -116,7 +116,7 @@ $ curl http://127.0.0.1:8080/
Welcome to the Swift world. Say hello!%
$ curl http://127.0.0.1:8080/hello
-Hi, this is powered by the Apache Spark 4.0.1.%
+Hi, this is powered by the Apache Spark 4.1.0.%
```
Run from source code.
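The web example pairs Spark Connect with a Swift web framework. As a rough sketch only, the `/hello` response shown above could be produced by a route like the one below; the Vapor-style `routes(_:)` signature, the route body, and the `SparkSession`/`version` names are assumptions, not the example's actual code.

```swift
import Vapor
import SparkConnect

// Hedged sketch of a Vapor-style route backed by Spark Connect.
func routes(_ app: Application) throws {
  app.get("hello") { (req: Request) async throws -> String in
    let spark = try await SparkSession.builder.getOrCreate()
    let version = await spark.version
    await spark.stop()
    return "Hi, this is powered by the Apache Spark \(version)."
  }
}
```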
diff --git a/README.md b/README.md
index 99e8595..45e0e4a 100644
--- a/README.md
+++ b/README.md
@@ -16,7 +16,7 @@ So far, this library project is tracking the upstream changes of [Apache Arrow](
## Requirement
-- [Apache Spark 4.0.1 (September 2025)](https://github.com/apache/spark/releases/tag/v4.0.1)
+- [Apache Spark 4.1.0 (December 2025)](https://github.com/apache/spark/releases/tag/v4.1.0)
- [Swift 6.2 (September 2025)](https://swift.org)
- [gRPC Swift 2.2 (November 2025)](https://github.com/grpc/grpc-swift-2/releases/tag/2.2.0)
- [gRPC Swift Protobuf 2.1 (August 2025)](https://github.com/grpc/grpc-swift-protobuf/releases/tag/2.1.1)
@@ -91,27 +91,28 @@ Run your Swift application.
```bash
$ swift run
...
-Connected to Apache Spark 4.0.1 Server
+Connected to Apache Spark 4.1.0 Server
EXECUTE: DROP TABLE IF EXISTS t
-EXECUTE: CREATE TABLE IF NOT EXISTS t(a INT)
+EXECUTE: CREATE TABLE IF NOT EXISTS t(a INT) USING ORC
EXECUTE: INSERT INTO t VALUES (1), (2), (3)
SELECT * FROM t
+---+
-| a |
+| a|
+---+
-| 2 |
-| 1 |
-| 3 |
+| 1|
+| 3|
+| 2|
++---+
+
++---+
+| id|
++---+
+| 6|
+| 8|
+| 4|
+| 2|
+| 0|
+---+
-+----+
-| id |
-+----+
-| 2 |
-| 6 |
-| 0 |
-| 8 |
-| 4 |
-+----+
```
You can find more complete examples including `Spark SQL REPL`, `Web Server` and `Streaming` applications in the [Examples](https://github.com/apache/spark-connect-swift/tree/main/Examples) directory.
diff --git a/Sources/SparkConnect/Documentation.docc/Examples.md b/Sources/SparkConnect/Documentation.docc/Examples.md
index 7a1a7ac..50b83fd 100644
--- a/Sources/SparkConnect/Documentation.docc/Examples.md
+++ b/Sources/SparkConnect/Documentation.docc/Examples.md
@@ -7,7 +7,7 @@ This document provides an overview of the example applications inside [Examples]
Start a Spark Connect Server:
```bash
-docker run -it --rm -p 15002:15002 apache/spark:4.0.1 bash -c "/opt/spark/sbin/start-connect-server.sh --wait -c spark.log.level=ERROR"
+docker run -it --rm -p 15002:15002 apache/spark:4.1.0 bash -c "/opt/spark/sbin/start-connect-server.sh --wait -c spark.log.level=ERROR"
```
## Basic Application Example
@@ -154,7 +154,7 @@ Welcome to the Swift world. Say hello!%
# Spark-powered endpoint
curl http://127.0.0.1:8080/hello
-Hi, this is powered by the Apache Spark 4.0.1.%
+Hi, this is powered by the Apache Spark 4.1.0.%
```
## Development Environment
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]