This is an automated email from the ASF dual-hosted git repository.

diwu pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris-website.git


The following commit(s) were added to refs/heads/master by this push:
     new d0039537abd [doc](ecosystem) add datatype mapping for flink spark connector (#3247)
d0039537abd is described below

commit d0039537abd72947e4d64478cc543d0ca3ee67bb
Author: wudi <[email protected]>
AuthorDate: Wed Dec 31 17:45:35 2025 +0800

    [doc](ecosystem) add datatype mapping for flink spark connector (#3247)
    
    ## Versions
    
    - [x] dev
    - [x] 4.x
    - [x] 3.x
    - [x] 2.1
    
    ## Languages
    
    - [x] Chinese
    - [x] English
    
    ## Docs Checklist
    
    - [ ] Checked by AI
    - [ ] Test Cases Built
---
 docs/ecosystem/flink-doris-connector.md            | 23 ++++++++++++-
 docs/ecosystem/spark-doris-connector.md            | 38 +++++++++++++++++-----
 .../current/ecosystem/flink-doris-connector.md     | 23 ++++++++++++-
 .../current/ecosystem/spark-doris-connector.md     | 37 ++++++++++++++++-----
 .../version-2.1/ecosystem/flink-doris-connector.md | 23 ++++++++++++-
 .../version-2.1/ecosystem/spark-doris-connector.md | 37 ++++++++++++++++-----
 .../version-3.x/ecosystem/flink-doris-connector.md | 23 ++++++++++++-
 .../version-3.x/ecosystem/spark-doris-connector.md | 37 ++++++++++++++++-----
 .../version-4.x/ecosystem/flink-doris-connector.md | 23 ++++++++++++-
 .../version-4.x/ecosystem/spark-doris-connector.md | 37 ++++++++++++++++-----
 .../version-2.1/ecosystem/flink-doris-connector.md | 23 ++++++++++++-
 .../version-2.1/ecosystem/spark-doris-connector.md | 38 +++++++++++++++++-----
 .../version-3.x/ecosystem/flink-doris-connector.md | 23 ++++++++++++-
 .../version-3.x/ecosystem/spark-doris-connector.md | 38 +++++++++++++++++-----
 .../version-4.x/ecosystem/flink-doris-connector.md | 23 ++++++++++++-
 .../version-4.x/ecosystem/spark-doris-connector.md | 38 +++++++++++++++++-----
 16 files changed, 404 insertions(+), 80 deletions(-)

diff --git a/docs/ecosystem/flink-doris-connector.md b/docs/ecosystem/flink-doris-connector.md
index a9012046b9e..9b5921d866d 100644
--- a/docs/ecosystem/flink-doris-connector.md
+++ b/docs/ecosystem/flink-doris-connector.md
@@ -901,7 +901,7 @@ After starting the Flink cluster, you can directly run the following command:
 | --multi-to-one-target   | Used in combination with multi-to-one-origin, the configuration of the target table, for example: --multi-to-one-target "a\|b" |
 | --create-table-only     | Whether to only synchronize the structure of the table.      |
 
-### Type Mapping
+### Doris to Flink Data Type Mapping
 
 | Doris Type | Flink Type |
 | ---------- | ---------- |
@@ -928,6 +928,27 @@ After starting the Flink cluster, you can directly run the following command:
 | IPV4       | STRING     |
 | IPV6       | STRING     |
 
+### Flink to Doris Data Type Mapping
+| Flink Type    | Doris Type     |
+| ------------- | -------------- |
+| BOOLEAN       | BOOLEAN        |
+| TINYINT       | TINYINT        |
+| SMALLINT      | SMALLINT       |
+| INTEGER       | INTEGER        |
+| BIGINT        | BIGINT         |
+| FLOAT         | FLOAT          |
+| DOUBLE        | DOUBLE         |
+| DECIMAL       | DECIMAL        |
+| CHAR          | CHAR           |
+| VARCHAR       | VARCHAR/STRING |
+| STRING        | STRING         |
+| DATE          | DATE           |
+| TIMESTAMP     | DATETIME       |
+| TIMESTAMP_LTZ | DATETIME       |
+| ARRAY         | ARRAY          |
+| MAP           | MAP/JSON       |
+| ROW           | STRUCT/JSON    |
+
 ### Monitoring Metrics
 
 Flink provides multiple [Metrics](https://nightlies.apache.org/flink/flink-docs-master/docs/ops/metrics/#metrics) for monitoring the indicators of the Flink cluster. The following are the newly added monitoring metrics for the Flink Doris Connector.
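As a quick sanity check of the new Flink-to-Doris table, the mapping can be expressed as a lookup (an illustrative Python sketch only; the dict mirrors the table above, and `doris_type_for` is a hypothetical helper, not connector API):

```python
# Flink-to-Doris type mapping, transcribed from the table above.
# Illustrative only: this helper is not part of the Flink Doris Connector API.
FLINK_TO_DORIS = {
    "BOOLEAN": "BOOLEAN", "TINYINT": "TINYINT", "SMALLINT": "SMALLINT",
    "INTEGER": "INTEGER", "BIGINT": "BIGINT", "FLOAT": "FLOAT",
    "DOUBLE": "DOUBLE", "DECIMAL": "DECIMAL", "CHAR": "CHAR",
    "VARCHAR": "VARCHAR/STRING", "STRING": "STRING", "DATE": "DATE",
    "TIMESTAMP": "DATETIME", "TIMESTAMP_LTZ": "DATETIME",
    "ARRAY": "ARRAY", "MAP": "MAP/JSON", "ROW": "STRUCT/JSON",
}

def doris_type_for(flink_type: str) -> str:
    """Return the Doris target type for a Flink SQL type name."""
    return FLINK_TO_DORIS[flink_type.strip().upper()]
```

Note that both TIMESTAMP and TIMESTAMP_LTZ land on DATETIME, so the mapping is not reversible.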
diff --git a/docs/ecosystem/spark-doris-connector.md b/docs/ecosystem/spark-doris-connector.md
index b58c3e5ff82..5ea6dbc11b2 100644
--- a/docs/ecosystem/spark-doris-connector.md
+++ b/docs/ecosystem/spark-doris-connector.md
@@ -21,6 +21,7 @@ Github: https://github.com/apache/doris-spark-connector
 
 | Connector | Spark               | Doris       | Java | Scala      |
 |-----------|---------------------|-------------|------|------------|
+| 25.2.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.1.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.1    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
@@ -39,7 +40,7 @@ Github: https://github.com/apache/doris-spark-connector
 <dependency>
     <groupId>org.apache.doris</groupId>
     <artifactId>spark-doris-connector-spark-3.5</artifactId>
-    <version>25.1.0</version>
+    <version>25.2.0</version>
 </dependency>
 ``` 
 
@@ -62,7 +63,7 @@ Starting from version 24.0.0, the naming rules of the Doris connector package ha
 
 When compiling, you can directly run `sh build.sh`; for details, refer here.
 
-After successful compilation, the target jar package will be generated in the `dist` directory, such as: spark-doris-connector-spark-3.5-25.1.0.jar. Copy this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`. For example, for `Spark` running in `Local` mode, put this file in the `jars/` folder. For `Spark` running in `Yarn` cluster mode, put this file in the pre-deployment package.
+After successful compilation, the target jar package will be generated in the `dist` directory, such as: spark-doris-connector-spark-3.5-25.2.0.jar. Copy this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`. For example, for `Spark` running in `Local` mode, put this file in the `jars/` folder. For `Spark` running in `Yarn` cluster mode, put this file in the pre-deployment package.
 You can also
 
 Execute in the source code directory:
@@ -71,21 +72,21 @@ Execute in the source code directory:
 
 Enter the Scala and Spark versions you need to compile according to the prompts.
 
-After successful compilation, the target jar package will be generated in the `dist` directory, such as: `spark-doris-connector-spark-3.5-25.1.0.jar`.
+After successful compilation, the target jar package will be generated in the `dist` directory, such as: `spark-doris-connector-spark-3.5-25.2.0.jar`.
 Copy this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`.
 
 For example, if `Spark` is running in `Local` mode, put this file in the `jars/` folder. If `Spark` is running in `Yarn` cluster mode, put this file in the pre-deployment package.
 
-For example, upload `spark-doris-connector-spark-3.5-25.1.0.jar` to hdfs and add the Jar package path on hdfs to the `spark.yarn.jars` parameter
+For example, upload `spark-doris-connector-spark-3.5-25.2.0.jar` to hdfs and add the Jar package path on hdfs to the `spark.yarn.jars` parameter
 ```shell
 
-1. Upload `spark-doris-connector-spark-3.5-25.1.0.jar` to hdfs.
+1. Upload `spark-doris-connector-spark-3.5-25.2.0.jar` to hdfs.
 
 hdfs dfs -mkdir /spark-jars/
-hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.1.0.jar /spark-jars/
+hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.2.0.jar /spark-jars/
 
-2. Add the `spark-doris-connector-spark-3.5-25.1.0.jar` dependency in the cluster.
-spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.1.0.jar
+2. Add the `spark-doris-connector-spark-3.5-25.2.0.jar` dependency in the cluster.
+spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.2.0.jar
 
 ```
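The two numbered steps in the shell block above amount to publishing the jar on HDFS and then listing its path in `spark.yarn.jars`. Spark reads that property as a comma-separated list, so a small sketch for assembling the value might look like this (illustrative helper, not Spark API; the jar path is the one from the doc):

```python
def yarn_jars_line(jar_paths):
    """Render a spark.yarn.jars config line from HDFS jar paths.

    Spark treats the property as a comma-separated list, so several
    jars can be combined on one line.
    """
    return "spark.yarn.jars=" + ",".join(jar_paths)

print(yarn_jars_line(
    ["hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.2.0.jar"]
))
```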
 
@@ -449,7 +450,7 @@ insert into your_catalog_name.your_doris_db.your_doris_table select * from your_
 | doris.filter.query          | --            | Filter expression of the query, which is transparently transmitted to Doris. Doris uses this expression to complete source-side data filtering. |
 
 
-## Doris & Spark Column Type Mapping
+## Data Type Mapping from Doris to Spark
 
 | Doris Type | Spark Type              |
 |------------|-------------------------|
@@ -474,6 +475,25 @@ insert into your_catalog_name.your_doris_db.your_doris_table select * from your_
 | HLL        | DataTypes.StringType    |
 | Bitmap     | DataTypes.StringType    |
 
+
+## Data Type Mapping from Spark to Doris
+
+| Spark Type     | Doris Type     |
+|----------------|----------------|
+| BooleanType    | BOOLEAN        |
+| ShortType      | SMALLINT       |
+| IntegerType    | INT            |
+| LongType       | BIGINT         |
+| FloatType      | FLOAT          |
+| DoubleType     | DOUBLE         |
+| DecimalType    | DECIMAL        |
+| StringType     | VARCHAR/STRING |
+| DateType       | DATE           |
+| TimestampType  | DATETIME       |
+| ArrayType      | ARRAY          |
+| MapType        | MAP/JSON       |
+| StructType     | STRUCT/JSON    |
+
 :::tip
 
 Since version 24.0.0, the Bitmap type is read back as a string, and the default return value is the string `Read unsupported`.
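The Spark-to-Doris table added above keys off the Catalyst type class names; since some of them are parameterized (for example `DecimalType(10,2)`), a lookup has to strip the parameters first. A hedged Python sketch (the table data is from the doc; the name-parsing helper is an assumption for illustration, not connector code):

```python
# Spark-to-Doris type mapping, transcribed from the table above.
# Illustrative only: not part of the Spark Doris Connector API.
SPARK_TO_DORIS = {
    "BooleanType": "BOOLEAN", "ShortType": "SMALLINT", "IntegerType": "INT",
    "LongType": "BIGINT", "FloatType": "FLOAT", "DoubleType": "DOUBLE",
    "DecimalType": "DECIMAL", "StringType": "VARCHAR/STRING",
    "DateType": "DATE", "TimestampType": "DATETIME", "ArrayType": "ARRAY",
    "MapType": "MAP/JSON", "StructType": "STRUCT/JSON",
}

def doris_type_for_spark(spark_type: str) -> str:
    """Map a Spark Catalyst type name, with or without parameters, to Doris."""
    base = spark_type.split("(", 1)[0].strip()  # "DecimalType(10,2)" -> "DecimalType"
    return SPARK_TO_DORIS[base]
```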
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/flink-doris-connector.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/flink-doris-connector.md
index 98e64eb9b31..209376b955f 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/flink-doris-connector.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/flink-doris-connector.md
@@ -902,7 +902,7 @@ Flink Doris Connector 中集成了[Flink CDC](https://nightlies.apache.org/flink
 | --multi-to-one-target   | 与 multi-to-one-origin 搭配使用,目标表的配置,比如:--multi-to-one-target "a\|b" |
 | --create-table-only     | 是否只仅仅同步表的结构                                       |
 
-### 类型映射
+### Doris 到 Flink 的数据类型映射
 
 | Doris Type | Flink Type |
 | ---------- | ---------- |
@@ -929,6 +929,27 @@ Flink Doris Connector 中集成了[Flink CDC](https://nightlies.apache.org/flink
 | IPV4       | STRING     |
 | IPV6       | STRING     |
 
+### Flink 到 Doris 的数据类型映射
+| Flink Type    | Doris Type     |
+| ------------- | -------------- |
+| BOOLEAN       | BOOLEAN        |
+| TINYINT       | TINYINT        |
+| SMALLINT      | SMALLINT       |
+| INTEGER       | INTEGER        |
+| BIGINT        | BIGINT         |
+| FLOAT         | FLOAT          |
+| DOUBLE        | DOUBLE         |
+| DECIMAL       | DECIMAL        |
+| CHAR          | CHAR           |
+| VARCHAR       | VARCHAR/STRING |
+| STRING        | STRING         |
+| DATE          | DATE           |
+| TIMESTAMP     | DATETIME       |
+| TIMESTAMP_LTZ | DATETIME       |
+| ARRAY         | ARRAY          |
+| MAP           | MAP/JSON       |
+| ROW           | STRUCT/JSON    |
+
 ### 监控指标
 
 Flink 提供了多种[Metrics](https://nightlies.apache.org/flink/flink-docs-master/docs/ops/metrics/#metrics)用于监测 Flink 集群的指标,以下为 Flink Doris Connector 新增的监控指标。
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/spark-doris-connector.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/spark-doris-connector.md
index 5190abf2947..6cc22599f18 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/spark-doris-connector.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/ecosystem/spark-doris-connector.md
@@ -20,6 +20,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 
 | Connector | Spark               | Doris       | Java | Scala      |
 |-----------|---------------------|-------------|------|------------|
+| 25.2.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.1.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.1    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
@@ -37,7 +38,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 <dependency>
     <groupId>org.apache.doris</groupId>
     <artifactId>spark-doris-connector-spark-3.5</artifactId>
-    <version>25.1.0</version>
+    <version>25.2.0</version>
 </dependency>
 ```
 
@@ -60,7 +61,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 
 编译时,可直接运行 `sh build.sh`,具体可参考这里。
 
-编译成功后,会在 `dist` 目录生成目标 jar 包,如:spark-doris-connector-spark-3.5-25.1.0.jar。将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
+编译成功后,会在 `dist` 目录生成目标 jar 包,如:spark-doris-connector-spark-3.5-25.2.0.jar。将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
 也可以
 
 
@@ -68,20 +69,20 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
    `sh build.sh`
    根据提示输入你需要的 Scala 与 Spark 版本进行编译。
 
-编译成功后,会在 `dist` 目录生成目标 jar 包,如:`spark-doris-connector-spark-3.5-25.1.0.jar`。
+编译成功后,会在 `dist` 目录生成目标 jar 包,如:`spark-doris-connector-spark-3.5-25.2.0.jar`。
 将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。
 
 例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
 
-例如将 `spark-doris-connector-spark-3.5-25.1.0.jar` 上传到 hdfs 并在 `spark.yarn.jars` 参数上添加 hdfs 上的 Jar 包路径
+例如将 `spark-doris-connector-spark-3.5-25.2.0.jar` 上传到 hdfs 并在 `spark.yarn.jars` 参数上添加 hdfs 上的 Jar 包路径
 ```shell
-1. 上传 `spark-doris-connector-spark-3.5-25.1.0.jar` 到 hdfs。
+1. 上传 `spark-doris-connector-spark-3.5-25.2.0.jar` 到 hdfs。
 
 hdfs dfs -mkdir /spark-jars/
-hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.1.0.jar /spark-jars/
+hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.2.0.jar /spark-jars/
 
-2. 在集群中添加 `spark-doris-connector-spark-3.5-25.1.0.jar` 依赖。
-spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.1.0.jar
+2. 在集群中添加 `spark-doris-connector-spark-3.5-25.2.0.jar` 依赖。
+spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.2.0.jar
 
 ```
 
@@ -449,7 +450,7 @@ insert into your_catalog_name.your_doris_db.your_doris_table select * from your_
 | doris.filter.query          | --            | 过滤读取数据的表达式,此表达式透传给 Doris。Doris 使用此表达式完成源端数据过滤。 |
 
 
-## Doris 和 Spark 列类型映射关系
+## Doris 到 Spark 列类型映射关系
 
 | Doris Type | Spark Type              |
 |------------|-------------------------|
@@ -474,6 +475,24 @@ insert into your_catalog_name.your_doris_db.your_doris_table select * from your_
 | HLL        | DataTypes.StringType    |
 | Bitmap     | DataTypes.StringType    |
 
+## Spark 到 Doris 的数据类型映射
+
+| Spark Type     | Doris Type     |
+|----------------|----------------|
+| BooleanType    | BOOLEAN        |
+| ShortType      | SMALLINT       |
+| IntegerType    | INT            |
+| LongType       | BIGINT         |
+| FloatType      | FLOAT          |
+| DoubleType     | DOUBLE         |
+| DecimalType    | DECIMAL        |
+| StringType     | VARCHAR/STRING |
+| DateType       | DATE           |
+| TimestampType  | DATETIME       |
+| ArrayType      | ARRAY          |
+| MapType        | MAP/JSON       |
+| StructType     | STRUCT/JSON    |
+
 :::tip
 
 从 24.0.0 版本开始,Bitmap 类型读取返回类型为字符串,默认返回字符串值 Read unsupported。
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/flink-doris-connector.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/flink-doris-connector.md
index cf688205417..aa2153568b9 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/flink-doris-connector.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/flink-doris-connector.md
@@ -902,7 +902,7 @@ Flink Doris Connector 中集成了[Flink CDC](https://nightlies.apache.org/flink
 | --multi-to-one-target   | 与 multi-to-one-origin 搭配使用,目标表的配置,比如:--multi-to-one-target "a\|b" |
 | --create-table-only     | 是否只仅仅同步表的结构                                       |
 
-### 类型映射
+### Doris 到 Flink 的数据类型映射
 
 | Doris Type | Flink Type |
 | ---------- | ---------- |
@@ -929,6 +929,27 @@ Flink Doris Connector 中集成了[Flink CDC](https://nightlies.apache.org/flink
 | IPV4       | STRING     |
 | IPV6       | STRING     |
 
+### Flink 到 Doris 的数据类型映射
+| Flink Type    | Doris Type     |
+| ------------- | -------------- |
+| BOOLEAN       | BOOLEAN        |
+| TINYINT       | TINYINT        |
+| SMALLINT      | SMALLINT       |
+| INTEGER       | INTEGER        |
+| BIGINT        | BIGINT         |
+| FLOAT         | FLOAT          |
+| DOUBLE        | DOUBLE         |
+| DECIMAL       | DECIMAL        |
+| CHAR          | CHAR           |
+| VARCHAR       | VARCHAR/STRING |
+| STRING        | STRING         |
+| DATE          | DATE           |
+| TIMESTAMP     | DATETIME       |
+| TIMESTAMP_LTZ | DATETIME       |
+| ARRAY         | ARRAY          |
+| MAP           | MAP/JSON       |
+| ROW           | STRUCT/JSON    |
+
 ### 监控指标
 
 Flink 提供了多种[Metrics](https://nightlies.apache.org/flink/flink-docs-master/docs/ops/metrics/#metrics)用于监测 Flink 集群的指标,以下为 Flink Doris Connector 新增的监控指标。
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/spark-doris-connector.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/spark-doris-connector.md
index 497bc25b0f4..0ff0028ba53 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/spark-doris-connector.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/ecosystem/spark-doris-connector.md
@@ -20,6 +20,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 
 | Connector | Spark               | Doris       | Java | Scala      |
 |-----------|---------------------|-------------|------|------------|
+| 25.2.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.1.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.1    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
@@ -37,7 +38,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 <dependency>
     <groupId>org.apache.doris</groupId>
     <artifactId>spark-doris-connector-spark-3.5</artifactId>
-    <version>25.1.0</version>
+    <version>25.2.0</version>
 </dependency>
 ```
 
@@ -60,7 +61,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 
 编译时,可直接运行 `sh build.sh`,具体可参考这里。
 
-编译成功后,会在 `dist` 目录生成目标 jar 包,如:spark-doris-connector-spark-3.5-25.1.0.jar。将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
+编译成功后,会在 `dist` 目录生成目标 jar 包,如:spark-doris-connector-spark-3.5-25.2.0.jar。将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
 也可以
 
 
@@ -68,20 +69,20 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
    `sh build.sh`
    根据提示输入你需要的 Scala 与 Spark 版本进行编译。
 
-编译成功后,会在 `dist` 目录生成目标 jar 包,如:`spark-doris-connector-spark-3.5-25.1.0.jar`。
+编译成功后,会在 `dist` 目录生成目标 jar 包,如:`spark-doris-connector-spark-3.5-25.2.0.jar`。
 将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。
 
 例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
 
-例如将 `spark-doris-connector-spark-3.5-25.1.0.jar` 上传到 hdfs 并在 `spark.yarn.jars` 参数上添加 hdfs 上的 Jar 包路径
+例如将 `spark-doris-connector-spark-3.5-25.2.0.jar` 上传到 hdfs 并在 `spark.yarn.jars` 参数上添加 hdfs 上的 Jar 包路径
 ```shell
-1. 上传 `spark-doris-connector-spark-3.5-25.1.0.jar` 到 hdfs。
+1. 上传 `spark-doris-connector-spark-3.5-25.2.0.jar` 到 hdfs。
 
 hdfs dfs -mkdir /spark-jars/
-hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.1.0.jar /spark-jars/
+hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.2.0.jar /spark-jars/
 
-2. 在集群中添加 `spark-doris-connector-spark-3.5-25.1.0.jar` 依赖。
-spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.1.0.jar
+2. 在集群中添加 `spark-doris-connector-spark-3.5-25.2.0.jar` 依赖。
+spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.2.0.jar
 
 ```
 
@@ -449,7 +450,7 @@ insert into your_catalog_name.your_doris_db.your_doris_table select * from your_
 | doris.filter.query          | --            | 过滤读取数据的表达式,此表达式透传给 Doris。Doris 使用此表达式完成源端数据过滤。 |
 
 
-## Doris 和 Spark 列类型映射关系
+## Doris 到 Spark 列类型映射关系
 
 | Doris Type | Spark Type              |
 |------------|-------------------------|
@@ -474,6 +475,24 @@ insert into your_catalog_name.your_doris_db.your_doris_table select * from your_
 | HLL        | DataTypes.StringType    |
 | Bitmap     | DataTypes.StringType    |
 
+## Spark 到 Doris 的数据类型映射
+
+| Spark Type     | Doris Type     |
+|----------------|----------------|
+| BooleanType    | BOOLEAN        |
+| ShortType      | SMALLINT       |
+| IntegerType    | INT            |
+| LongType       | BIGINT         |
+| FloatType      | FLOAT          |
+| DoubleType     | DOUBLE         |
+| DecimalType    | DECIMAL        |
+| StringType     | VARCHAR/STRING |
+| DateType       | DATE           |
+| TimestampType  | DATETIME       |
+| ArrayType      | ARRAY          |
+| MapType        | MAP/JSON       |
+| StructType     | STRUCT/JSON    |
+
 :::tip
 
 从 24.0.0 版本开始,Bitmap 类型读取返回类型为字符串,默认返回字符串值 Read unsupported。
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.x/ecosystem/flink-doris-connector.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.x/ecosystem/flink-doris-connector.md
index 5dda34df616..0ddba4499b1 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.x/ecosystem/flink-doris-connector.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.x/ecosystem/flink-doris-connector.md
@@ -902,7 +902,7 @@ Flink Doris Connector 中集成了[Flink CDC](https://nightlies.apache.org/flink
 | --multi-to-one-target   | 与 multi-to-one-origin 搭配使用,目标表的配置,比如:--multi-to-one-target "a\|b" |
 | --create-table-only     | 是否只仅仅同步表的结构                                       |
 
-### 类型映射
+### Doris 到 Flink 的数据类型映射
 
 | Doris Type | Flink Type |
 | ---------- | ---------- |
@@ -929,6 +929,27 @@ Flink Doris Connector 中集成了[Flink CDC](https://nightlies.apache.org/flink
 | IPV4       | STRING     |
 | IPV6       | STRING     |
 
+### Flink 到 Doris 的数据类型映射
+| Flink Type    | Doris Type     |
+| ------------- | -------------- |
+| BOOLEAN       | BOOLEAN        |
+| TINYINT       | TINYINT        |
+| SMALLINT      | SMALLINT       |
+| INTEGER       | INTEGER        |
+| BIGINT        | BIGINT         |
+| FLOAT         | FLOAT          |
+| DOUBLE        | DOUBLE         |
+| DECIMAL       | DECIMAL        |
+| CHAR          | CHAR           |
+| VARCHAR       | VARCHAR/STRING |
+| STRING        | STRING         |
+| DATE          | DATE           |
+| TIMESTAMP     | DATETIME       |
+| TIMESTAMP_LTZ | DATETIME       |
+| ARRAY         | ARRAY          |
+| MAP           | MAP/JSON       |
+| ROW           | STRUCT/JSON    |
+
 ### 监控指标
 
 Flink 提供了多种[Metrics](https://nightlies.apache.org/flink/flink-docs-master/docs/ops/metrics/#metrics)用于监测 Flink 集群的指标,以下为 Flink Doris Connector 新增的监控指标。
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.x/ecosystem/spark-doris-connector.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.x/ecosystem/spark-doris-connector.md
index 497bc25b0f4..0ff0028ba53 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.x/ecosystem/spark-doris-connector.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.x/ecosystem/spark-doris-connector.md
@@ -20,6 +20,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 
 | Connector | Spark               | Doris       | Java | Scala      |
 |-----------|---------------------|-------------|------|------------|
+| 25.2.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.1.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.1    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
@@ -37,7 +38,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 <dependency>
     <groupId>org.apache.doris</groupId>
     <artifactId>spark-doris-connector-spark-3.5</artifactId>
-    <version>25.1.0</version>
+    <version>25.2.0</version>
 </dependency>
 ```
 
@@ -60,7 +61,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 
 编译时,可直接运行 `sh build.sh`,具体可参考这里。
 
-编译成功后,会在 `dist` 目录生成目标 jar 包,如:spark-doris-connector-spark-3.5-25.1.0.jar。将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
+编译成功后,会在 `dist` 目录生成目标 jar 包,如:spark-doris-connector-spark-3.5-25.2.0.jar。将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
 也可以
 
 
@@ -68,20 +69,20 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
    `sh build.sh`
    根据提示输入你需要的 Scala 与 Spark 版本进行编译。
 
-编译成功后,会在 `dist` 目录生成目标 jar 包,如:`spark-doris-connector-spark-3.5-25.1.0.jar`。
+编译成功后,会在 `dist` 目录生成目标 jar 包,如:`spark-doris-connector-spark-3.5-25.2.0.jar`。
 将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。
 
 例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
 
-例如将 `spark-doris-connector-spark-3.5-25.1.0.jar` 上传到 hdfs 并在 `spark.yarn.jars` 参数上添加 hdfs 上的 Jar 包路径
+例如将 `spark-doris-connector-spark-3.5-25.2.0.jar` 上传到 hdfs 并在 `spark.yarn.jars` 参数上添加 hdfs 上的 Jar 包路径
 ```shell
-1. 上传 `spark-doris-connector-spark-3.5-25.1.0.jar` 到 hdfs。
+1. 上传 `spark-doris-connector-spark-3.5-25.2.0.jar` 到 hdfs。
 
 hdfs dfs -mkdir /spark-jars/
-hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.1.0.jar /spark-jars/
+hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.2.0.jar /spark-jars/
 
-2. 在集群中添加 `spark-doris-connector-spark-3.5-25.1.0.jar` 依赖。
-spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.1.0.jar
+2. 在集群中添加 `spark-doris-connector-spark-3.5-25.2.0.jar` 依赖。
+spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.2.0.jar
 
 ```
 
@@ -449,7 +450,7 @@ insert into your_catalog_name.your_doris_db.your_doris_table select * from your_
 | doris.filter.query          | --            | 过滤读取数据的表达式,此表达式透传给 Doris。Doris 使用此表达式完成源端数据过滤。 |
 
 
-## Doris 和 Spark 列类型映射关系
+## Doris 到 Spark 列类型映射关系
 
 | Doris Type | Spark Type              |
 |------------|-------------------------|
@@ -474,6 +475,24 @@ insert into your_catalog_name.your_doris_db.your_doris_table select * from your_
 | HLL        | DataTypes.StringType    |
 | Bitmap     | DataTypes.StringType    |
 
+## Spark 到 Doris 的数据类型映射
+
+| Spark Type     | Doris Type     |
+|----------------|----------------|
+| BooleanType    | BOOLEAN        |
+| ShortType      | SMALLINT       |
+| IntegerType    | INT            |
+| LongType       | BIGINT         |
+| FloatType      | FLOAT          |
+| DoubleType     | DOUBLE         |
+| DecimalType    | DECIMAL        |
+| StringType     | VARCHAR/STRING |
+| DateType       | DATE           |
+| TimestampType  | DATETIME       |
+| ArrayType      | ARRAY          |
+| MapType        | MAP/JSON       |
+| StructType     | STRUCT/JSON    |
+
 :::tip
 
 从 24.0.0 版本开始,Bitmap 类型读取返回类型为字符串,默认返回字符串值 Read unsupported。
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/ecosystem/flink-doris-connector.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/ecosystem/flink-doris-connector.md
index 98e64eb9b31..209376b955f 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/ecosystem/flink-doris-connector.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/ecosystem/flink-doris-connector.md
@@ -902,7 +902,7 @@ Flink Doris Connector 中集成了[Flink CDC](https://nightlies.apache.org/flink
 | --multi-to-one-target   | 与 multi-to-one-origin 搭配使用,目标表的配置,比如:--multi-to-one-target "a\|b" |
 | --create-table-only     | 是否只仅仅同步表的结构                                       |
 
-### 类型映射
+### Doris 到 Flink 的数据类型映射
 
 | Doris Type | Flink Type |
 | ---------- | ---------- |
@@ -929,6 +929,27 @@ Flink Doris Connector 中集成了[Flink CDC](https://nightlies.apache.org/flink
 | IPV4       | STRING     |
 | IPV6       | STRING     |
 
+### Flink 到 Doris 的数据类型映射
+| Flink Type    | Doris Type     |
+| ------------- | -------------- |
+| BOOLEAN       | BOOLEAN        |
+| TINYINT       | TINYINT        |
+| SMALLINT      | SMALLINT       |
+| INTEGER       | INTEGER        |
+| BIGINT        | BIGINT         |
+| FLOAT         | FLOAT          |
+| DOUBLE        | DOUBLE         |
+| DECIMAL       | DECIMAL        |
+| CHAR          | CHAR           |
+| VARCHAR       | VARCHAR/STRING |
+| STRING        | STRING         |
+| DATE          | DATE           |
+| TIMESTAMP     | DATETIME       |
+| TIMESTAMP_LTZ | DATETIME       |
+| ARRAY         | ARRAY          |
+| MAP           | MAP/JSON       |
+| ROW           | STRUCT/JSON    |
+
 ### 监控指标
 
 Flink 提供了多种[Metrics](https://nightlies.apache.org/flink/flink-docs-master/docs/ops/metrics/#metrics)用于监测 Flink 集群的指标,以下为 Flink Doris Connector 新增的监控指标。
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/ecosystem/spark-doris-connector.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/ecosystem/spark-doris-connector.md
index 5190abf2947..6cc22599f18 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/ecosystem/spark-doris-connector.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/ecosystem/spark-doris-connector.md
@@ -20,6 +20,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 
 | Connector | Spark               | Doris       | Java | Scala      |
 |-----------|---------------------|-------------|------|------------|
+| 25.2.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.1.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.1    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
@@ -37,7 +38,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 <dependency>
     <groupId>org.apache.doris</groupId>
     <artifactId>spark-doris-connector-spark-3.5</artifactId>
-    <version>25.1.0</version>
+    <version>25.2.0</version>
 </dependency>
 ```
 
@@ -60,7 +61,7 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
 
 编译时,可直接运行 `sh build.sh`,具体可参考这里。
 
-编译成功后,会在 `dist` 目录生成目标 jar 包,如:spark-doris-connector-spark-3.5-25.1.0.jar。将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
+编译成功后,会在 `dist` 目录生成目标 jar 包,如:spark-doris-connector-spark-3.5-25.2.0.jar。将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
 也可以
 
 
@@ -68,20 +69,20 @@ Spark Doris Connector 可以支持通过 Spark 读取 Doris 中存储的数据
    `sh build.sh`
    根据提示输入你需要的 Scala 与 Spark 版本进行编译。
 
-编译成功后,会在 `dist` 目录生成目标 jar 包,如:`spark-doris-connector-spark-3.5-25.1.0.jar`。
+编译成功后,会在 `dist` 目录生成目标 jar 包,如:`spark-doris-connector-spark-3.5-25.2.0.jar`。
 将此文件复制到 `Spark` 的 `ClassPath` 中即可使用 `Spark-Doris-Connector`。
 
 例如,`Local` 模式运行的 `Spark`,将此文件放入 `jars/` 文件夹下。`Yarn`集群模式运行的`Spark`,则将此文件放入预部署包中。
 
-例如将 `spark-doris-connector-spark-3.5-25.1.0.jar` 上传到 hdfs 并在 `spark.yarn.jars` 参数上添加 hdfs 上的 Jar 包路径
+例如将 `spark-doris-connector-spark-3.5-25.2.0.jar` 上传到 hdfs 并在 `spark.yarn.jars` 参数上添加 hdfs 上的 Jar 包路径
 ```shell
-1. 上传 `spark-doris-connector-spark-3.5-25.1.0.jar` 到 hdfs。
+1. 上传 `spark-doris-connector-spark-3.5-25.2.0.jar` 到 hdfs。
 
 hdfs dfs -mkdir /spark-jars/
-hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.1.0.jar /spark-jars/
+hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.2.0.jar /spark-jars/
 
-2. 在集群中添加 `spark-doris-connector-spark-3.5-25.1.0.jar` 依赖。
-spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.1.0.jar
+2. 在集群中添加 `spark-doris-connector-spark-3.5-25.2.0.jar` 依赖。
+spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.2.0.jar
 
 ```
 
@@ -449,7 +450,7 @@ insert into your_catalog_name.your_doris_db.your_doris_table select * from your_
 | doris.filter.query          | --            | Filter expression for reading data, passed through to Doris; Doris uses this expression to complete source-side data filtering. |
 
 
-## Doris and Spark Column Type Mapping
+## Doris to Spark Column Type Mapping
 
 | Doris Type | Spark Type              |
 |------------|-------------------------|
@@ -474,6 +475,24 @@ insert into your_catalog_name.your_doris_db.your_doris_table select * from your_
 | HLL        | DataTypes.StringType    |
 | Bitmap     | DataTypes.StringType    |
 
+## Spark to Doris Data Type Mapping
+
+| Spark Type     | Doris Type     |
+|----------------|----------------|
+| BooleanType    | BOOLEAN        |
+| ShortType      | SMALLINT       |
+| IntegerType    | INT            |
+| LongType       | BIGINT         |
+| FloatType      | FLOAT          |
+| DoubleType     | DOUBLE         |
+| DecimalType    | DECIMAL        |
+| StringType     | VARCHAR/STRING |
+| DateType       | DATE           |
+| TimestampType  | DATETIME       |
+| ArrayType      | ARRAY          |
+| MapType        | MAP/JSON       |
+| StructType     | STRUCT/JSON    |
+
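To make the table above concrete, the mapping can be written down as a plain lookup, e.g. to sketch which Doris column type a given Spark schema field lands on. This is a hypothetical illustration only (the `SPARK_TO_DORIS` dict and `doris_type` helper are made-up names, not part of the connector API):

```python
# Hypothetical sketch of the Spark-to-Doris data type mapping table above.
# Entries with a slash (e.g. VARCHAR/STRING) list the alternative Doris types.
SPARK_TO_DORIS = {
    "BooleanType": "BOOLEAN",
    "ShortType": "SMALLINT",
    "IntegerType": "INT",
    "LongType": "BIGINT",
    "FloatType": "FLOAT",
    "DoubleType": "DOUBLE",
    "DecimalType": "DECIMAL",
    "StringType": "VARCHAR/STRING",
    "DateType": "DATE",
    "TimestampType": "DATETIME",
    "ArrayType": "ARRAY",
    "MapType": "MAP/JSON",
    "StructType": "STRUCT/JSON",
}

def doris_type(spark_type: str) -> str:
    """Return the Doris type for a Spark SQL type name, per the table above."""
    try:
        return SPARK_TO_DORIS[spark_type]
    except KeyError:
        raise ValueError(f"no Doris mapping for Spark type {spark_type!r}")

print(doris_type("LongType"))       # BIGINT
print(doris_type("TimestampType"))  # DATETIME
```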
 :::tip
 
 Since version 24.0.0, reading a Bitmap type returns a string, with the default string value `Read unsupported`.
diff --git a/versioned_docs/version-2.1/ecosystem/flink-doris-connector.md b/versioned_docs/version-2.1/ecosystem/flink-doris-connector.md
index 493f4278310..d76b3eb15ec 100644
--- a/versioned_docs/version-2.1/ecosystem/flink-doris-connector.md
+++ b/versioned_docs/version-2.1/ecosystem/flink-doris-connector.md
@@ -901,7 +901,7 @@ After starting the Flink cluster, you can directly run the 
following command:
 | --multi-to-one-target   | Used in combination with multi-to-one-origin, the 
configuration of the target table, for example: --multi-to-one-target "a\|b" |
 | --create-table-only     | Whether to only synchronize the structure of the 
table.      |
 
-### Type Mapping
+### Doris to Flink Data Type Mapping
 
 | Doris Type | Flink Type |
 | ---------- | ---------- |
@@ -928,6 +928,27 @@ After starting the Flink cluster, you can directly run the 
following command:
 | IPV4       | STRING     |
 | IPV6       | STRING     |
 
+### Flink to Doris Data Type Mapping
+| Flink Type    | Doris Type     |
+| ------------- | -------------- |
+| BOOLEAN       | BOOLEAN        |
+| TINYINT       | TINYINT        |
+| SMALLINT      | SMALLINT       |
+| INTEGER       | INTEGER        |
+| BIGINT        | BIGINT         |
+| FLOAT         | FLOAT          |
+| DOUBLE        | DOUBLE         |
+| DECIMAL       | DECIMAL        |
+| CHAR          | CHAR           |
+| VARCHAR       | VARCHAR/STRING |
+| STRING        | STRING         |
+| DATE          | DATE           |
+| TIMESTAMP     | DATETIME       |
+| TIMESTAMP_LTZ | DATETIME       |
+| ARRAY         | ARRAY          |
+| MAP           | MAP/JSON       |
+| ROW           | STRUCT/JSON    |
+
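As a quick illustration, the Flink-to-Doris table above can be applied mechanically to a schema, e.g. to sketch the Doris column definitions a Flink sink schema would line up against. This is a hypothetical snippet (`FLINK_TO_DORIS` and `doris_columns` are made-up names, not part of the connector):

```python
# Hypothetical sketch of the Flink-to-Doris data type mapping table above.
# Entries with a slash (e.g. MAP/JSON) list the alternative Doris types.
FLINK_TO_DORIS = {
    "BOOLEAN": "BOOLEAN", "TINYINT": "TINYINT", "SMALLINT": "SMALLINT",
    "INTEGER": "INTEGER", "BIGINT": "BIGINT", "FLOAT": "FLOAT",
    "DOUBLE": "DOUBLE", "DECIMAL": "DECIMAL", "CHAR": "CHAR",
    "VARCHAR": "VARCHAR/STRING", "STRING": "STRING", "DATE": "DATE",
    "TIMESTAMP": "DATETIME", "TIMESTAMP_LTZ": "DATETIME",
    "ARRAY": "ARRAY", "MAP": "MAP/JSON", "ROW": "STRUCT/JSON",
}

def doris_columns(fields):
    """Map (name, Flink type) pairs to Doris column definition strings."""
    return [f"{name} {FLINK_TO_DORIS[ftype]}" for name, ftype in fields]

print(doris_columns([("id", "BIGINT"), ("ts", "TIMESTAMP_LTZ")]))
# ['id BIGINT', 'ts DATETIME']
```

Note that both `TIMESTAMP` and `TIMESTAMP_LTZ` collapse to Doris `DATETIME`, so the time-zone distinction is lost on the Doris side.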
 ### Monitoring Metrics
 
 Flink provides multiple 
[Metrics](https://nightlies.apache.org/flink/flink-docs-master/docs/ops/metrics/#metrics)
 for monitoring the indicators of the Flink cluster. The following are the 
newly added monitoring metrics for the Flink Doris Connector.
diff --git a/versioned_docs/version-2.1/ecosystem/spark-doris-connector.md b/versioned_docs/version-2.1/ecosystem/spark-doris-connector.md
index f152e4ed569..cd695879b7c 100644
--- a/versioned_docs/version-2.1/ecosystem/spark-doris-connector.md
+++ b/versioned_docs/version-2.1/ecosystem/spark-doris-connector.md
@@ -21,6 +21,7 @@ Github: https://github.com/apache/doris-spark-connector
 
 | Connector | Spark               | Doris       | Java | Scala      |
 |-----------|---------------------|-------------|------|------------|
+| 25.2.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.1.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.1    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
@@ -39,7 +40,7 @@ Github: https://github.com/apache/doris-spark-connector
 <dependency>
     <groupId>org.apache.doris</groupId>
     <artifactId>spark-doris-connector-spark-3.5</artifactId>
-    <version>25.1.0</version>
+    <version>25.2.0</version>
 </dependency>
 ``` 
 
@@ -62,7 +63,7 @@ Starting from version 24.0.0, the naming rules of the Doris 
connector package ha
 
 When compiling, you can directly run `sh build.sh`, for details, please refer 
to here.
 
-After successful compilation, the target jar package will be generated in the 
`dist` directory, such as: spark-doris-connector-spark-3.5-25.1.0.jar. Copy 
this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`. For 
example, for `Spark` running in `Local` mode, put this file in the `jars/` 
folder. For `Spark` running in `Yarn` cluster mode, put this file in the 
pre-deployment package.
+After successful compilation, the target jar package will be generated in the 
`dist` directory, such as: spark-doris-connector-spark-3.5-25.2.0.jar. Copy 
this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`. For 
example, for `Spark` running in `Local` mode, put this file in the `jars/` 
folder. For `Spark` running in `Yarn` cluster mode, put this file in the 
pre-deployment package.
 You can also
 
 Execute in the source code directory:
@@ -71,21 +72,21 @@ Execute in the source code directory:
 
 Enter the Scala and Spark versions you need to compile according to the 
prompts.
 
-After successful compilation, the target jar package will be generated in the 
`dist` directory, such as: `spark-doris-connector-spark-3.5-25.1.0.jar`.
+After successful compilation, the target jar package will be generated in the 
`dist` directory, such as: `spark-doris-connector-spark-3.5-25.2.0.jar`.
 Copy this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`.
 
 For example, if `Spark` is running in `Local` mode, put this file in the 
`jars/` folder. If `Spark` is running in `Yarn` cluster mode, put this file in 
the pre-deployment package.
 
-For example, upload `spark-doris-connector-spark-3.5-25.1.0.jar` to hdfs and 
add the Jar package path on hdfs to the `spark.yarn.jars` parameter
+For example, upload `spark-doris-connector-spark-3.5-25.2.0.jar` to hdfs and 
add the Jar package path on hdfs to the `spark.yarn.jars` parameter
 ```shell
 
-1. Upload `spark-doris-connector-spark-3.5-25.1.0.jar` to hdfs.
+1. Upload `spark-doris-connector-spark-3.5-25.2.0.jar` to hdfs.
 
 hdfs dfs -mkdir /spark-jars/
-hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.1.0.jar /spark-jars/
+hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.2.0.jar /spark-jars/
 
-2. Add the `spark-doris-connector-spark-3.5-25.1.0.jar` dependency in the 
cluster.
-spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.1.0.jar
+2. Add the `spark-doris-connector-spark-3.5-25.2.0.jar` dependency in the 
cluster.
+spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.2.0.jar
 
 ```
 
@@ -449,7 +450,7 @@ insert into 
your_catalog_name.your_doris_db.your_doris_table select * from your_
 | doris.filter.query          | --            | Filter expression of the 
query, which is transparently transmitted to Doris. Doris uses this expression 
to complete source-side data filtering. |
 
 
-## Doris & Spark Column Type Mapping
+## Data Type Mapping from Doris to Spark
 
 | Doris Type | Spark Type              |
 |------------|-------------------------|
@@ -474,6 +475,25 @@ insert into 
your_catalog_name.your_doris_db.your_doris_table select * from your_
 | HLL        | DataTypes.StringType    |
 | Bitmap     | DataTypes.StringType    |
 
+
+## Data Type Mapping from Spark to Doris
+
+| Spark Type     | Doris Type     |
+|----------------|----------------|
+| BooleanType    | BOOLEAN        |
+| ShortType      | SMALLINT       |
+| IntegerType    | INT            |
+| LongType       | BIGINT         |
+| FloatType      | FLOAT          |
+| DoubleType     | DOUBLE         |
+| DecimalType    | DECIMAL        |
+| StringType     | VARCHAR/STRING |
+| DateType       | DATE           |
+| TimestampType  | DATETIME       |
+| ArrayType      | ARRAY          |
+| MapType        | MAP/JSON       |
+| StructType     | STRUCT/JSON    |
+
 :::tip
 
 Since version 24.0.0, the return type of the Bitmap type is string type, and 
the default return value is string value `Read unsupported`.
diff --git a/versioned_docs/version-3.x/ecosystem/flink-doris-connector.md b/versioned_docs/version-3.x/ecosystem/flink-doris-connector.md
index 493f4278310..d76b3eb15ec 100644
--- a/versioned_docs/version-3.x/ecosystem/flink-doris-connector.md
+++ b/versioned_docs/version-3.x/ecosystem/flink-doris-connector.md
@@ -901,7 +901,7 @@ After starting the Flink cluster, you can directly run the 
following command:
 | --multi-to-one-target   | Used in combination with multi-to-one-origin, the 
configuration of the target table, for example: --multi-to-one-target "a\|b" |
 | --create-table-only     | Whether to only synchronize the structure of the 
table.      |
 
-### Type Mapping
+### Doris to Flink Data Type Mapping
 
 | Doris Type | Flink Type |
 | ---------- | ---------- |
@@ -928,6 +928,27 @@ After starting the Flink cluster, you can directly run the 
following command:
 | IPV4       | STRING     |
 | IPV6       | STRING     |
 
+### Flink to Doris Data Type Mapping
+| Flink Type    | Doris Type     |
+| ------------- | -------------- |
+| BOOLEAN       | BOOLEAN        |
+| TINYINT       | TINYINT        |
+| SMALLINT      | SMALLINT       |
+| INTEGER       | INTEGER        |
+| BIGINT        | BIGINT         |
+| FLOAT         | FLOAT          |
+| DOUBLE        | DOUBLE         |
+| DECIMAL       | DECIMAL        |
+| CHAR          | CHAR           |
+| VARCHAR       | VARCHAR/STRING |
+| STRING        | STRING         |
+| DATE          | DATE           |
+| TIMESTAMP     | DATETIME       |
+| TIMESTAMP_LTZ | DATETIME       |
+| ARRAY         | ARRAY          |
+| MAP           | MAP/JSON       |
+| ROW           | STRUCT/JSON    |
+
 ### Monitoring Metrics
 
 Flink provides multiple 
[Metrics](https://nightlies.apache.org/flink/flink-docs-master/docs/ops/metrics/#metrics)
 for monitoring the indicators of the Flink cluster. The following are the 
newly added monitoring metrics for the Flink Doris Connector.
diff --git a/versioned_docs/version-3.x/ecosystem/spark-doris-connector.md b/versioned_docs/version-3.x/ecosystem/spark-doris-connector.md
index f152e4ed569..cd695879b7c 100644
--- a/versioned_docs/version-3.x/ecosystem/spark-doris-connector.md
+++ b/versioned_docs/version-3.x/ecosystem/spark-doris-connector.md
@@ -21,6 +21,7 @@ Github: https://github.com/apache/doris-spark-connector
 
 | Connector | Spark               | Doris       | Java | Scala      |
 |-----------|---------------------|-------------|------|------------|
+| 25.2.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.1.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.1    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
@@ -39,7 +40,7 @@ Github: https://github.com/apache/doris-spark-connector
 <dependency>
     <groupId>org.apache.doris</groupId>
     <artifactId>spark-doris-connector-spark-3.5</artifactId>
-    <version>25.1.0</version>
+    <version>25.2.0</version>
 </dependency>
 ``` 
 
@@ -62,7 +63,7 @@ Starting from version 24.0.0, the naming rules of the Doris 
connector package ha
 
 When compiling, you can directly run `sh build.sh`, for details, please refer 
to here.
 
-After successful compilation, the target jar package will be generated in the 
`dist` directory, such as: spark-doris-connector-spark-3.5-25.1.0.jar. Copy 
this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`. For 
example, for `Spark` running in `Local` mode, put this file in the `jars/` 
folder. For `Spark` running in `Yarn` cluster mode, put this file in the 
pre-deployment package.
+After successful compilation, the target jar package will be generated in the 
`dist` directory, such as: spark-doris-connector-spark-3.5-25.2.0.jar. Copy 
this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`. For 
example, for `Spark` running in `Local` mode, put this file in the `jars/` 
folder. For `Spark` running in `Yarn` cluster mode, put this file in the 
pre-deployment package.
 You can also
 
 Execute in the source code directory:
@@ -71,21 +72,21 @@ Execute in the source code directory:
 
 Enter the Scala and Spark versions you need to compile according to the 
prompts.
 
-After successful compilation, the target jar package will be generated in the 
`dist` directory, such as: `spark-doris-connector-spark-3.5-25.1.0.jar`.
+After successful compilation, the target jar package will be generated in the 
`dist` directory, such as: `spark-doris-connector-spark-3.5-25.2.0.jar`.
 Copy this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`.
 
 For example, if `Spark` is running in `Local` mode, put this file in the 
`jars/` folder. If `Spark` is running in `Yarn` cluster mode, put this file in 
the pre-deployment package.
 
-For example, upload `spark-doris-connector-spark-3.5-25.1.0.jar` to hdfs and 
add the Jar package path on hdfs to the `spark.yarn.jars` parameter
+For example, upload `spark-doris-connector-spark-3.5-25.2.0.jar` to hdfs and 
add the Jar package path on hdfs to the `spark.yarn.jars` parameter
 ```shell
 
-1. Upload `spark-doris-connector-spark-3.5-25.1.0.jar` to hdfs.
+1. Upload `spark-doris-connector-spark-3.5-25.2.0.jar` to hdfs.
 
 hdfs dfs -mkdir /spark-jars/
-hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.1.0.jar /spark-jars/
+hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.2.0.jar /spark-jars/
 
-2. Add the `spark-doris-connector-spark-3.5-25.1.0.jar` dependency in the 
cluster.
-spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.1.0.jar
+2. Add the `spark-doris-connector-spark-3.5-25.2.0.jar` dependency in the 
cluster.
+spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.2.0.jar
 
 ```
 
@@ -449,7 +450,7 @@ insert into 
your_catalog_name.your_doris_db.your_doris_table select * from your_
 | doris.filter.query          | --            | Filter expression of the 
query, which is transparently transmitted to Doris. Doris uses this expression 
to complete source-side data filtering. |
 
 
-## Doris & Spark Column Type Mapping
+## Data Type Mapping from Doris to Spark
 
 | Doris Type | Spark Type              |
 |------------|-------------------------|
@@ -474,6 +475,25 @@ insert into 
your_catalog_name.your_doris_db.your_doris_table select * from your_
 | HLL        | DataTypes.StringType    |
 | Bitmap     | DataTypes.StringType    |
 
+
+## Data Type Mapping from Spark to Doris
+
+| Spark Type     | Doris Type     |
+|----------------|----------------|
+| BooleanType    | BOOLEAN        |
+| ShortType      | SMALLINT       |
+| IntegerType    | INT            |
+| LongType       | BIGINT         |
+| FloatType      | FLOAT          |
+| DoubleType     | DOUBLE         |
+| DecimalType    | DECIMAL        |
+| StringType     | VARCHAR/STRING |
+| DateType       | DATE           |
+| TimestampType  | DATETIME       |
+| ArrayType      | ARRAY          |
+| MapType        | MAP/JSON       |
+| StructType     | STRUCT/JSON    |
+
 :::tip
 
 Since version 24.0.0, the return type of the Bitmap type is string type, and 
the default return value is string value `Read unsupported`.
diff --git a/versioned_docs/version-4.x/ecosystem/flink-doris-connector.md b/versioned_docs/version-4.x/ecosystem/flink-doris-connector.md
index a9012046b9e..9b5921d866d 100644
--- a/versioned_docs/version-4.x/ecosystem/flink-doris-connector.md
+++ b/versioned_docs/version-4.x/ecosystem/flink-doris-connector.md
@@ -901,7 +901,7 @@ After starting the Flink cluster, you can directly run the 
following command:
 | --multi-to-one-target   | Used in combination with multi-to-one-origin, the 
configuration of the target table, for example: --multi-to-one-target "a\|b" |
 | --create-table-only     | Whether to only synchronize the structure of the 
table.      |
 
-### Type Mapping
+### Doris to Flink Data Type Mapping
 
 | Doris Type | Flink Type |
 | ---------- | ---------- |
@@ -928,6 +928,27 @@ After starting the Flink cluster, you can directly run the 
following command:
 | IPV4       | STRING     |
 | IPV6       | STRING     |
 
+### Flink to Doris Data Type Mapping
+| Flink Type    | Doris Type     |
+| ------------- | -------------- |
+| BOOLEAN       | BOOLEAN        |
+| TINYINT       | TINYINT        |
+| SMALLINT      | SMALLINT       |
+| INTEGER       | INTEGER        |
+| BIGINT        | BIGINT         |
+| FLOAT         | FLOAT          |
+| DOUBLE        | DOUBLE         |
+| DECIMAL       | DECIMAL        |
+| CHAR          | CHAR           |
+| VARCHAR       | VARCHAR/STRING |
+| STRING        | STRING         |
+| DATE          | DATE           |
+| TIMESTAMP     | DATETIME       |
+| TIMESTAMP_LTZ | DATETIME       |
+| ARRAY         | ARRAY          |
+| MAP           | MAP/JSON       |
+| ROW           | STRUCT/JSON    |
+
 ### Monitoring Metrics
 
 Flink provides multiple 
[Metrics](https://nightlies.apache.org/flink/flink-docs-master/docs/ops/metrics/#metrics)
 for monitoring the indicators of the Flink cluster. The following are the 
newly added monitoring metrics for the Flink Doris Connector.
diff --git a/versioned_docs/version-4.x/ecosystem/spark-doris-connector.md b/versioned_docs/version-4.x/ecosystem/spark-doris-connector.md
index b58c3e5ff82..5ea6dbc11b2 100644
--- a/versioned_docs/version-4.x/ecosystem/spark-doris-connector.md
+++ b/versioned_docs/version-4.x/ecosystem/spark-doris-connector.md
@@ -21,6 +21,7 @@ Github: https://github.com/apache/doris-spark-connector
 
 | Connector | Spark               | Doris       | Java | Scala      |
 |-----------|---------------------|-------------|------|------------|
+| 25.2.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.1.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.1    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
 | 25.0.0    | 3.5 - 3.1, 2.4      | 1.0 +       | 8    | 2.12, 2.11 |
@@ -39,7 +40,7 @@ Github: https://github.com/apache/doris-spark-connector
 <dependency>
     <groupId>org.apache.doris</groupId>
     <artifactId>spark-doris-connector-spark-3.5</artifactId>
-    <version>25.1.0</version>
+    <version>25.2.0</version>
 </dependency>
 ``` 
 
@@ -62,7 +63,7 @@ Starting from version 24.0.0, the naming rules of the Doris 
connector package ha
 
 When compiling, you can directly run `sh build.sh`, for details, please refer 
to here.
 
-After successful compilation, the target jar package will be generated in the 
`dist` directory, such as: spark-doris-connector-spark-3.5-25.1.0.jar. Copy 
this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`. For 
example, for `Spark` running in `Local` mode, put this file in the `jars/` 
folder. For `Spark` running in `Yarn` cluster mode, put this file in the 
pre-deployment package.
+After successful compilation, the target jar package will be generated in the 
`dist` directory, such as: spark-doris-connector-spark-3.5-25.2.0.jar. Copy 
this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`. For 
example, for `Spark` running in `Local` mode, put this file in the `jars/` 
folder. For `Spark` running in `Yarn` cluster mode, put this file in the 
pre-deployment package.
 You can also
 
 Execute in the source code directory:
@@ -71,21 +72,21 @@ Execute in the source code directory:
 
 Enter the Scala and Spark versions you need to compile according to the 
prompts.
 
-After successful compilation, the target jar package will be generated in the 
`dist` directory, such as: `spark-doris-connector-spark-3.5-25.1.0.jar`.
+After successful compilation, the target jar package will be generated in the 
`dist` directory, such as: `spark-doris-connector-spark-3.5-25.2.0.jar`.
 Copy this file to the `ClassPath` of `Spark` to use `Spark-Doris-Connector`.
 
 For example, if `Spark` is running in `Local` mode, put this file in the 
`jars/` folder. If `Spark` is running in `Yarn` cluster mode, put this file in 
the pre-deployment package.
 
-For example, upload `spark-doris-connector-spark-3.5-25.1.0.jar` to hdfs and 
add the Jar package path on hdfs to the `spark.yarn.jars` parameter
+For example, upload `spark-doris-connector-spark-3.5-25.2.0.jar` to hdfs and 
add the Jar package path on hdfs to the `spark.yarn.jars` parameter
 ```shell
 
-1. Upload `spark-doris-connector-spark-3.5-25.1.0.jar` to hdfs.
+1. Upload `spark-doris-connector-spark-3.5-25.2.0.jar` to hdfs.
 
 hdfs dfs -mkdir /spark-jars/
-hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.1.0.jar /spark-jars/
+hdfs dfs -put /your_local_path/spark-doris-connector-spark-3.5-25.2.0.jar /spark-jars/
 
-2. Add the `spark-doris-connector-spark-3.5-25.1.0.jar` dependency in the 
cluster.
-spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.1.0.jar
+2. Add the `spark-doris-connector-spark-3.5-25.2.0.jar` dependency in the 
cluster.
+spark.yarn.jars=hdfs:///spark-jars/spark-doris-connector-spark-3.5-25.2.0.jar
 
 ```
 
@@ -449,7 +450,7 @@ insert into 
your_catalog_name.your_doris_db.your_doris_table select * from your_
 | doris.filter.query          | --            | Filter expression of the 
query, which is transparently transmitted to Doris. Doris uses this expression 
to complete source-side data filtering. |
 
 
-## Doris & Spark Column Type Mapping
+## Data Type Mapping from Doris to Spark
 
 | Doris Type | Spark Type              |
 |------------|-------------------------|
@@ -474,6 +475,25 @@ insert into 
your_catalog_name.your_doris_db.your_doris_table select * from your_
 | HLL        | DataTypes.StringType    |
 | Bitmap     | DataTypes.StringType    |
 
+
+## Data Type Mapping from Spark to Doris
+
+| Spark Type     | Doris Type     |
+|----------------|----------------|
+| BooleanType    | BOOLEAN        |
+| ShortType      | SMALLINT       |
+| IntegerType    | INT            |
+| LongType       | BIGINT         |
+| FloatType      | FLOAT          |
+| DoubleType     | DOUBLE         |
+| DecimalType    | DECIMAL        |
+| StringType     | VARCHAR/STRING |
+| DateType       | DATE           |
+| TimestampType  | DATETIME       |
+| ArrayType      | ARRAY          |
+| MapType        | MAP/JSON       |
+| StructType     | STRUCT/JSON    |
+
 :::tip
 
 Since version 24.0.0, the return type of the Bitmap type is string type, and 
the default return value is string value `Read unsupported`.


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
