This is an automated email from the ASF dual-hosted git repository.

diwu pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris-website.git


The following commit(s) were added to refs/heads/master by this push:
     new fc5d9a3ebcd [doc](ecosystem)update spark connector version to 25.1.0 (#2429)
fc5d9a3ebcd is described below

commit fc5d9a3ebcd616f78e8caf594a470813eac5e072
Author: wudi <w...@selectdb.com>
AuthorDate: Fri May 30 14:30:54 2025 +0800

    [doc](ecosystem)update spark connector version to 25.1.0 (#2429)
    
    ## Versions
    
    - [x] dev
    - [x] 3.0
    - [x] 2.1
    - [ ] 2.0
    
    ## Languages
    
    - [x] Chinese
    - [x] English
    
    ## Docs Checklist
    
    - [ ] Checked by AI
    - [ ] Test Cases Built
---
 docs/ecosystem/spark-doris-connector.md                 | 17 +++++++++--------
 .../current/ecosystem/spark-doris-connector.md          | 17 +++++++++--------
 .../version-2.1/ecosystem/spark-doris-connector.md      | 17 +++++++++--------
 .../version-3.0/ecosystem/spark-doris-connector.md      | 17 +++++++++--------
 .../version-2.1/ecosystem/spark-doris-connector.md      | 17 +++++++++--------
 .../version-3.0/ecosystem/spark-doris-connector.md      | 17 +++++++++--------
 6 files changed, 54 insertions(+), 48 deletions(-)

diff --git a/docs/ecosystem/spark-doris-connector.md b/docs/ecosystem/spark-doris-connector.md
index 566e9661a40..5454201349c 100644
--- a/docs/ecosystem/spark-doris-connector.md
+++ b/docs/ecosystem/spark-doris-connector.md
@@ -39,6 +39,7 @@ Github: https://github.com/apache/doris-spark-connector
 
 | Connector | Spark               | Doris       | Java | Scala      |
 |-----------|---------------------|-------------|------|------------|
+| 25.1.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.1    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 24.0.0    | 3.5 ~ 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
@@ -56,7 +57,7 @@ Github: https://github.com/apache/doris-spark-connector
 <dependency>
     <groupId>org.apache.doris</groupId>
     <artifactId>spark-doris-connector-spark-3.5</artifactId>
-    <version>25.0.1</version>
+    <version>25.1.0</version>
 </dependency>
 ``` 
 
@@ -79,7 +80,7 @@ Starting from version 24.0.0, the naming rules of the Doris connector package ha
 
 When compiling, you can directly run `sh build.sh`, for details, please refer to here.
 
-After successful compilation, the target jar package will be generated in the `dist` directory, such as: spark-doris-connector-spark-3.5-25.0.1.jar. Copy this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`. For example, for `Spark` running in `Local` mode, put this file in the `jars/` folder. For `Spark` running in `Yarn` cluster mode, put this file in the pre-deployment package.
+After successful compilation, the target jar package will be generated in the `dist` directory, such as: spark-doris-connector-spark-3.5-25.1.0.jar. Copy this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`. For example, for `Spark` running in `Local` mode, put this file in the `jars/` folder. For `Spark` running in `Yarn` cluster mode, put this file in the pre-deployment package.
 You can also
 
 Execute in the source code directory:
@@ -88,21 +89,21 @@ Execute in the source code directory:
 
 Enter the Scala and Spark versions you need to compile according to the prompts.
 
-After successful compilation, the target jar package will be generated in the `dist` directory, such as: `spark-doris-connector-spark-3.5-25.0.1.jar`.
+After successful compilation, the target jar package will be generated in the `dist` directory, such as: `spark-doris-connector-spark-3.5-25.1.0.jar`.
 Copy this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`.
 
 For example, if `Spark` is running in `Local` mode, put this file in the `jars/` folder. If `Spark` is running in `Yarn` cluster mode, put this file in the pre-deployment package.
 
-For example, upload `spark-doris-connector-spark-3.5-25.0.1.jar` to hdfs and add the Jar package path on hdfs to the `spark.yarn.jars` parameter
+For example, upload `spark-doris-connector-spark-3.5-25.1.0.jar` to hdfs and add the Jar package path on hdfs to the `spark.yarn.jars` parameter
 ```shell
 
-1. Upload `spark-doris-connector-spark-3.5-25.0.1.jar` to hdfs.
+1. Upload `spark-doris-connector-spark-3.5-25.1.0.jar` to hdfs.
 
 hdfs dfs -mkdir /spark-jars/
-hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.0.1.jar /spark-jars/
+hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.1.0.jar /spark-jars/
 
-2. Add the `spark-doris-connector-spark-3.5-25.0.1.jar` dependency in the cluster.
-spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.0.1.jar
+2. Add the `spark-doris-connector-spark-3.5-25.1.0.jar` dependency in the cluster.
+spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.1.0.jar
 
 ```
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/spark-doris-connector.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/spark-doris-connector.md
index 4f38bb4aa6c..599c615bc72 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/spark-doris-connector.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/spark-doris-connector.md
@@ -38,6 +38,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 
 | Connector | Spark               | Doris       | Java | Scala      |
 |-----------|---------------------|-------------|------|------------|
+| 25.1.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.1    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 1.3.2     | 3.4 - 3.1, 2.4, 2.3 | 1.0 - 2.1.6 | 8    | 2.12, 2.11 |
@@ -54,7 +55,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 <dependency>
     <groupId>org.apache.doris</groupId>
     <artifactId>spark-doris-connector-spark-3.5</artifactId>
-    <version>25.0.1</version>
+    <version>25.1.0</version>
 </dependency>
 ```
 
@@ -77,7 +78,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 
 编译时,可直接运行 `sh build.sh`,具体可参考这里。
 
-编译成功后,会在 `dist` 目录生成目标 jar 包,如:spark-doris-connector-spark-3.5-25.0.1.jar。 将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。 例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
+编译成功后,会在 `dist` 目录生成目标 jar 包,如:spark-doris-connector-spark-3.5-25.1.0.jar。将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
 也可以
 
 
@@ -85,20 +86,20 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
    `sh build.sh`
    根据提示输入你需要的 Scala 与 Spark 版本进行编译。
 
-编译成功后,会在 `dist` 目录生成目标 jar 包,如:`spark-doris-connector-spark-3.5-25.0.1.jar`。
+编译成功后,会在 `dist` 目录生成目标 jar 包,如:`spark-doris-connector-spark-3.5-25.1.0.jar`。
 将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。
 
 例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
 
-例如将 `spark-doris-connector-spark-3.5-25.0.1.jar` 上传到 hdfs 并在 `spark.yarn.jars` 参数上添加 hdfs 上的 Jar包路径
+例如将 `spark-doris-connector-spark-3.5-25.1.0.jar` 上传到 hdfs 并在 `spark.yarn.jars` 参数上添加 hdfs 上的 Jar 包路径
 ```shell
-1. 上传 `spark-doris-connector-spark-3.5-25.0.1.jar` 到 hdfs。
+1. 上传 `spark-doris-connector-spark-3.5-25.1.0.jar` 到 hdfs。
 
 hdfs dfs -mkdir /spark-jars/
-hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.0.1.jar /spark-jars/
+hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.1.0.jar /spark-jars/
 
-2. 在集群中添加 `spark-doris-connector-spark-3.5-25.0.1.jar` 依赖。
-spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.0.1.jar
+2. 在集群中添加 `spark-doris-connector-spark-3.5-25.1.0.jar` 依赖。
+spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.1.0.jar
 
 ```
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/spark-doris-connector.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/spark-doris-connector.md
index 449c45d6287..9a397f57ed5 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/spark-doris-connector.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/spark-doris-connector.md
@@ -38,6 +38,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 
 | Connector | Spark               | Doris       | Java | Scala      |
 |-----------|---------------------|-------------|------|------------|
+| 25.1.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.1    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 1.3.2     | 3.4 - 3.1, 2.4, 2.3 | 1.0 - 2.1.6 | 8    | 2.12, 2.11 |
@@ -54,7 +55,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 <dependency>
     <groupId>org.apache.doris</groupId>
     <artifactId>spark-doris-connector-spark-3.5</artifactId>
-    <version>25.0.1</version>
+    <version>25.1.0</version>
 </dependency>
 ```
 
@@ -77,7 +78,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 
 编译时,可直接运行 `sh build.sh`,具体可参考这里。
 
-编译成功后,会在 `dist` 目录生成目标 jar 包,如:spark-doris-connector-spark-3.5-25.0.1.jar。 将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。 例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
+编译成功后,会在 `dist` 目录生成目标 jar 包,如:spark-doris-connector-spark-3.5-25.1.0.jar。将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
 也可以
 
 
@@ -85,20 +86,20 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
    `sh build.sh`
    根据提示输入你需要的 Scala 与 Spark 版本进行编译。
 
-编译成功后,会在 `dist` 目录生成目标 jar 包,如:`spark-doris-connector-spark-3.5-25.0.1.jar`。
+编译成功后,会在 `dist` 目录生成目标 jar 包,如:`spark-doris-connector-spark-3.5-25.1.0.jar`。
 将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。
 
 例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
 
-例如将 `spark-doris-connector-spark-3.5-25.0.1.jar` 上传到 hdfs 并在 `spark.yarn.jars` 参数上添加 hdfs 上的 Jar包路径
+例如将 `spark-doris-connector-spark-3.5-25.1.0.jar` 上传到 hdfs 并在 `spark.yarn.jars` 参数上添加 hdfs 上的 Jar 包路径
 ```shell
-1. 上传 `spark-doris-connector-spark-3.5-25.0.1.jar` 到 hdfs。
+1. 上传 `spark-doris-connector-spark-3.5-25.1.0.jar` 到 hdfs。
 
 hdfs dfs -mkdir /spark-jars/
-hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.0.1.jar /spark-jars/
+hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.1.0.jar /spark-jars/
 
-2. 在集群中添加 `spark-doris-connector-spark-3.5-25.0.1.jar` 依赖。
-spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.0.1.jar
+2. 在集群中添加 `spark-doris-connector-spark-3.5-25.1.0.jar` 依赖。
+spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.1.0.jar
 
 ```
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/ecosystem/spark-doris-connector.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/ecosystem/spark-doris-connector.md
index 449c45d6287..9a397f57ed5 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/ecosystem/spark-doris-connector.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/ecosystem/spark-doris-connector.md
@@ -38,6 +38,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 
 | Connector | Spark               | Doris       | Java | Scala      |
 |-----------|---------------------|-------------|------|------------|
+| 25.1.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.1    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 1.3.2     | 3.4 - 3.1, 2.4, 2.3 | 1.0 - 2.1.6 | 8    | 2.12, 2.11 |
@@ -54,7 +55,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 <dependency>
     <groupId>org.apache.doris</groupId>
     <artifactId>spark-doris-connector-spark-3.5</artifactId>
-    <version>25.0.1</version>
+    <version>25.1.0</version>
 </dependency>
 ```
 
@@ -77,7 +78,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 
 编译时,可直接运行 `sh build.sh`,具体可参考这里。
 
-编译成功后,会在 `dist` 目录生成目标 jar 包,如:spark-doris-connector-spark-3.5-25.0.1.jar。 将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。 例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
+编译成功后,会在 `dist` 目录生成目标 jar 包,如:spark-doris-connector-spark-3.5-25.1.0.jar。将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
 也可以
 
 
@@ -85,20 +86,20 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
    `sh build.sh`
    根据提示输入你需要的 Scala 与 Spark 版本进行编译。
 
-编译成功后,会在 `dist` 目录生成目标 jar 包,如:`spark-doris-connector-spark-3.5-25.0.1.jar`。
+编译成功后,会在 `dist` 目录生成目标 jar 包,如:`spark-doris-connector-spark-3.5-25.1.0.jar`。
 将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。
 
 例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
 
-例如将 `spark-doris-connector-spark-3.5-25.0.1.jar` 上传到 hdfs 并在 `spark.yarn.jars` 参数上添加 hdfs 上的 Jar包路径
+例如将 `spark-doris-connector-spark-3.5-25.1.0.jar` 上传到 hdfs 并在 `spark.yarn.jars` 参数上添加 hdfs 上的 Jar 包路径
 ```shell
-1. 上传 `spark-doris-connector-spark-3.5-25.0.1.jar` 到 hdfs。
+1. 上传 `spark-doris-connector-spark-3.5-25.1.0.jar` 到 hdfs。
 
 hdfs dfs -mkdir /spark-jars/
-hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.0.1.jar /spark-jars/
+hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.1.0.jar /spark-jars/
 
-2. 在集群中添加 `spark-doris-connector-spark-3.5-25.0.1.jar` 依赖。
-spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.0.1.jar
+2. 在集群中添加 `spark-doris-connector-spark-3.5-25.1.0.jar` 依赖。
+spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.1.0.jar
 
 ```
 
diff --git a/versioned_docs/version-2.1/ecosystem/spark-doris-connector.md b/versioned_docs/version-2.1/ecosystem/spark-doris-connector.md
index 4e0660fac0d..8e3d239cfbf 100644
--- a/versioned_docs/version-2.1/ecosystem/spark-doris-connector.md
+++ b/versioned_docs/version-2.1/ecosystem/spark-doris-connector.md
@@ -39,6 +39,7 @@ Github: https://github.com/apache/doris-spark-connector
 
 | Connector | Spark               | Doris       | Java | Scala      |
 |-----------|---------------------|-------------|------|------------|
+| 25.1.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.1    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 24.0.0    | 3.5 ~ 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
@@ -56,7 +57,7 @@ Github: https://github.com/apache/doris-spark-connector
 <dependency>
     <groupId>org.apache.doris</groupId>
     <artifactId>spark-doris-connector-spark-3.5</artifactId>
-    <version>25.0.1</version>
+    <version>25.1.0</version>
 </dependency>
 ``` 
 
@@ -79,7 +80,7 @@ Starting from version 24.0.0, the naming rules of the Doris connector package ha
 
 When compiling, you can directly run `sh build.sh`, for details, please refer to here.
 
-After successful compilation, the target jar package will be generated in the `dist` directory, such as: spark-doris-connector-spark-3.5-25.0.1.jar. Copy this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`. For example, for `Spark` running in `Local` mode, put this file in the `jars/` folder. For `Spark` running in `Yarn` cluster mode, put this file in the pre-deployment package.
+After successful compilation, the target jar package will be generated in the `dist` directory, such as: spark-doris-connector-spark-3.5-25.1.0.jar. Copy this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`. For example, for `Spark` running in `Local` mode, put this file in the `jars/` folder. For `Spark` running in `Yarn` cluster mode, put this file in the pre-deployment package.
 You can also
 
 Execute in the source code directory:
@@ -88,21 +89,21 @@ Execute in the source code directory:
 
 Enter the Scala and Spark versions you need to compile according to the prompts.
 
-After successful compilation, the target jar package will be generated in the `dist` directory, such as: `spark-doris-connector-spark-3.5-25.0.1.jar`.
+After successful compilation, the target jar package will be generated in the `dist` directory, such as: `spark-doris-connector-spark-3.5-25.1.0.jar`.
 Copy this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`.
 
 For example, if `Spark` is running in `Local` mode, put this file in the `jars/` folder. If `Spark` is running in `Yarn` cluster mode, put this file in the pre-deployment package.
 
-For example, upload `spark-doris-connector-spark-3.5-25.0.1.jar` to hdfs and add the Jar package path on hdfs to the `spark.yarn.jars` parameter
+For example, upload `spark-doris-connector-spark-3.5-25.1.0.jar` to hdfs and add the Jar package path on hdfs to the `spark.yarn.jars` parameter
 ```shell
 
-1. Upload `spark-doris-connector-spark-3.5-25.0.1.jar` to hdfs.
+1. Upload `spark-doris-connector-spark-3.5-25.1.0.jar` to hdfs.
 
 hdfs dfs -mkdir /spark-jars/
-hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.0.1.jar /spark-jars/
+hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.1.0.jar /spark-jars/
 
-2. Add the `spark-doris-connector-spark-3.5-25.0.1.jar` dependency in the cluster.
-spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.0.1.jar
+2. Add the `spark-doris-connector-spark-3.5-25.1.0.jar` dependency in the cluster.
+spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.1.0.jar
 
 ```
 
diff --git a/versioned_docs/version-3.0/ecosystem/spark-doris-connector.md b/versioned_docs/version-3.0/ecosystem/spark-doris-connector.md
index 4e0660fac0d..8e3d239cfbf 100644
--- a/versioned_docs/version-3.0/ecosystem/spark-doris-connector.md
+++ b/versioned_docs/version-3.0/ecosystem/spark-doris-connector.md
@@ -39,6 +39,7 @@ Github: https://github.com/apache/doris-spark-connector
 
 | Connector | Spark               | Doris       | Java | Scala      |
 |-----------|---------------------|-------------|------|------------|
+| 25.1.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.1    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 24.0.0    | 3.5 ~ 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
@@ -56,7 +57,7 @@ Github: https://github.com/apache/doris-spark-connector
 <dependency>
     <groupId>org.apache.doris</groupId>
     <artifactId>spark-doris-connector-spark-3.5</artifactId>
-    <version>25.0.1</version>
+    <version>25.1.0</version>
 </dependency>
 ``` 
 
@@ -79,7 +80,7 @@ Starting from version 24.0.0, the naming rules of the Doris connector package ha
 
 When compiling, you can directly run `sh build.sh`, for details, please refer to here.
 
-After successful compilation, the target jar package will be generated in the `dist` directory, such as: spark-doris-connector-spark-3.5-25.0.1.jar. Copy this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`. For example, for `Spark` running in `Local` mode, put this file in the `jars/` folder. For `Spark` running in `Yarn` cluster mode, put this file in the pre-deployment package.
+After successful compilation, the target jar package will be generated in the `dist` directory, such as: spark-doris-connector-spark-3.5-25.1.0.jar. Copy this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`. For example, for `Spark` running in `Local` mode, put this file in the `jars/` folder. For `Spark` running in `Yarn` cluster mode, put this file in the pre-deployment package.
 You can also
 
 Execute in the source code directory:
@@ -88,21 +89,21 @@ Execute in the source code directory:
 
 Enter the Scala and Spark versions you need to compile according to the prompts.
 
-After successful compilation, the target jar package will be generated in the `dist` directory, such as: `spark-doris-connector-spark-3.5-25.0.1.jar`.
+After successful compilation, the target jar package will be generated in the `dist` directory, such as: `spark-doris-connector-spark-3.5-25.1.0.jar`.
 Copy this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`.
 
 For example, if `Spark` is running in `Local` mode, put this file in the `jars/` folder. If `Spark` is running in `Yarn` cluster mode, put this file in the pre-deployment package.
 
-For example, upload `spark-doris-connector-spark-3.5-25.0.1.jar` to hdfs and add the Jar package path on hdfs to the `spark.yarn.jars` parameter
+For example, upload `spark-doris-connector-spark-3.5-25.1.0.jar` to hdfs and add the Jar package path on hdfs to the `spark.yarn.jars` parameter
 ```shell
 
-1. Upload `spark-doris-connector-spark-3.5-25.0.1.jar` to hdfs.
+1. Upload `spark-doris-connector-spark-3.5-25.1.0.jar` to hdfs.
 
 hdfs dfs -mkdir /spark-jars/
-hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.0.1.jar /spark-jars/
+hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.1.0.jar /spark-jars/
 
-2. Add the `spark-doris-connector-spark-3.5-25.0.1.jar` dependency in the cluster.
-spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.0.1.jar
+2. Add the `spark-doris-connector-spark-3.5-25.1.0.jar` dependency in the cluster.
+spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.1.0.jar
 
 ```
 


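For anyone following the HDFS deployment steps changed above, a minimal sketch of running a job against the uploaded connector jar is shown below. It uses `--jars` as a per-job alternative to the `spark.yarn.jars` setting from the docs; the application class and application jar path are hypothetical placeholders, not part of this commit.

```shell
# Sketch only, assuming the connector jar was uploaded to hdfs:///spark-jars/ as described above.
# --jars distributes the connector to the driver and executors for this job;
# the docs' spark.yarn.jars setting is the cluster-wide alternative.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --jars hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.1.0.jar \
  --class com.example.MyDorisJob \
  /path/to/my-doris-job.jar
```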
---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org
