This is an automated email from the ASF dual-hosted git repository.
dongjoon pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/spark-docker.git
The following commit(s) were added to refs/heads/master by this push:
new bbfad01 [SPARK-52543] Download Apache Spark distributions via ASF Mirrors site
bbfad01 is described below
commit bbfad0134f7a3534f4cd735ccd7b1ffe84ce6ae8
Author: Dongjoon Hyun <[email protected]>
AuthorDate: Fri Jun 20 13:10:26 2025 -0700
[SPARK-52543] Download Apache Spark distributions via ASF Mirrors site
### What changes were proposed in this pull request?
This PR aims to download `Apache Spark` distributions via ASF Mirrors site.
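For reference, the new URLs resolve through the `closer.lua` redirector, so a plain `wget` still works. A minimal sketch of fetching and verifying a release this way (the output file names and keyserver choice are illustrative; the URLs and GPG key ID are the ones used in the Dockerfiles below):
```bash
# Download the distribution via the ASF mirror redirector; closer.lua with
# action=download redirects to a nearby mirror, which wget follows.
wget -nv -O spark.tgz \
  "https://www.apache.org/dyn/closer.lua/spark/spark-4.0.0/spark-4.0.0-bin-hadoop3.tgz?action=download"
wget -nv -O spark.tgz.asc \
  "https://www.apache.org/dyn/closer.lua/spark/spark-4.0.0/spark-4.0.0-bin-hadoop3.tgz.asc?action=download"

# Verify the signature against the Spark release key (keyserver is illustrative).
gpg --keyserver hkps://keys.openpgp.org --recv-keys 4DC9676CEF9A83E98FCA02784D6620843CD87F5A
gpg --verify spark.tgz.asc spark.tgz
```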
### Why are the changes needed?
ASF Mirrors are the recommended download location instead of the flaky `archive.apache.org`.
- https://github.com/apache/spark-docker/actions/runs/15784378078/job/44497425327?pr=87
```
0.064 + wget -nv -O spark.tgz https://archive.apache.org/dist/spark/spark-4.0.0/spark-4.0.0-bin-hadoop3.tgz
133.8 failed: Connection timed out.
133.8 failed: Network is unreachable.
```
### Does this PR introduce _any_ user-facing change?
No.
### How was this patch tested?
Pass the CIs.
Closes #88 from dongjoon-hyun/SPARK-52543.
Authored-by: Dongjoon Hyun <[email protected]>
Signed-off-by: Dongjoon Hyun <[email protected]>
---
4.0.0/scala2.13-java17-ubuntu/Dockerfile | 4 ++--
4.0.0/scala2.13-java21-ubuntu/Dockerfile | 4 ++--
Dockerfile.template | 4 ++--
3 files changed, 6 insertions(+), 6 deletions(-)
diff --git a/4.0.0/scala2.13-java17-ubuntu/Dockerfile b/4.0.0/scala2.13-java17-ubuntu/Dockerfile
index a94344f..031fc3e 100644
--- a/4.0.0/scala2.13-java17-ubuntu/Dockerfile
+++ b/4.0.0/scala2.13-java17-ubuntu/Dockerfile
@@ -36,8 +36,8 @@ RUN set -ex; \
# Install Apache Spark
# https://downloads.apache.org/spark/KEYS
-ENV SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-4.0.0/spark-4.0.0-bin-hadoop3.tgz \
-    SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-4.0.0/spark-4.0.0-bin-hadoop3.tgz.asc \
+ENV SPARK_TGZ_URL=https://www.apache.org/dyn/closer.lua/spark/spark-4.0.0/spark-4.0.0-bin-hadoop3.tgz?action=download \
+    SPARK_TGZ_ASC_URL=https://www.apache.org/dyn/closer.lua/spark/spark-4.0.0/spark-4.0.0-bin-hadoop3.tgz.asc?action=download \
GPG_KEY=4DC9676CEF9A83E98FCA02784D6620843CD87F5A
RUN set -ex; \
diff --git a/4.0.0/scala2.13-java21-ubuntu/Dockerfile b/4.0.0/scala2.13-java21-ubuntu/Dockerfile
index 33093f3..15bd36b 100644
--- a/4.0.0/scala2.13-java21-ubuntu/Dockerfile
+++ b/4.0.0/scala2.13-java21-ubuntu/Dockerfile
@@ -36,8 +36,8 @@ RUN set -ex; \
# Install Apache Spark
# https://downloads.apache.org/spark/KEYS
-ENV SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-4.0.0/spark-4.0.0-bin-hadoop3.tgz \
-    SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-4.0.0/spark-4.0.0-bin-hadoop3.tgz.asc \
+ENV SPARK_TGZ_URL=https://www.apache.org/dyn/closer.lua/spark/spark-4.0.0/spark-4.0.0-bin-hadoop3.tgz?action=download \
+    SPARK_TGZ_ASC_URL=https://www.apache.org/dyn/closer.lua/spark/spark-4.0.0/spark-4.0.0-bin-hadoop3.tgz.asc?action=download \
GPG_KEY=4DC9676CEF9A83E98FCA02784D6620843CD87F5A
RUN set -ex; \
diff --git a/Dockerfile.template b/Dockerfile.template
index 1ef733a..a410e06 100644
--- a/Dockerfile.template
+++ b/Dockerfile.template
@@ -36,8 +36,8 @@ RUN set -ex; \
# Install Apache Spark
# https://downloads.apache.org/spark/KEYS
-ENV SPARK_TGZ_URL=https://archive.apache.org/dist/spark/spark-{{ SPARK_VERSION }}/spark-{{ SPARK_VERSION }}-bin-hadoop3.tgz \
-    SPARK_TGZ_ASC_URL=https://archive.apache.org/dist/spark/spark-{{ SPARK_VERSION }}/spark-{{ SPARK_VERSION }}-bin-hadoop3.tgz.asc \
+ENV SPARK_TGZ_URL=https://www.apache.org/dyn/closer.lua/spark/spark-{{ SPARK_VERSION }}/spark-{{ SPARK_VERSION }}-bin-hadoop3.tgz?action=download \
+    SPARK_TGZ_ASC_URL=https://www.apache.org/dyn/closer.lua/spark/spark-{{ SPARK_VERSION }}/spark-{{ SPARK_VERSION }}-bin-hadoop3.tgz.asc?action=download \
GPG_KEY={{ SPARK_GPG_KEY }}
RUN set -ex; \
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]