This is an automated email from the ASF dual-hosted git repository.
diwu pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris-website.git
The following commit(s) were added to refs/heads/master by this push:
new 4350da91a30 [doc](ecosystem) fix typo for streaming job (#3435)
4350da91a30 is described below
commit 4350da91a30f9093c38e03e6a689f56be044f5f2
Author: wudi <[email protected]>
AuthorDate: Thu Mar 5 20:29:03 2026 +0800
[doc](ecosystem) fix typo for streaming job (#3435)
## Versions
- [x] dev
- [x] 4.x
- [ ] 3.x
- [ ] 2.1
## Languages
- [x] Chinese
- [x] English
## Docs Checklist
- [ ] Checked by AI
- [ ] Test Cases Built
---
.../data-source/migrate-data-from-other-oltp.md | 24 ++++++++++++++++++++++
.../streaming-job/streaming-job-multi-table.md | 6 +++---
.../import/streaming-job/streaming-job-tvf.md | 10 ++++-----
.../data-source/migrate-data-from-other-oltp.md | 24 ++++++++++++++++++++++
.../streaming-job/streaming-job-multi-table.md | 6 +++---
.../import/streaming-job/streaming-job-tvf.md | 10 ++++-----
.../version-4.x.json | 4 ++++
.../data-source/migrate-data-from-other-oltp.md | 24 ++++++++++++++++++++++
.../streaming-job/streaming-job-multi-table.md | 6 +++---
.../import/streaming-job/streaming-job-tvf.md | 8 ++++----
.../data-source/migrate-data-from-other-oltp.md | 24 ++++++++++++++++++++++
.../streaming-job/streaming-job-multi-table.md | 6 +++---
.../import/streaming-job/streaming-job-tvf.md | 10 ++++-----
13 files changed, 131 insertions(+), 31 deletions(-)
diff --git a/docs/data-operate/import/data-source/migrate-data-from-other-oltp.md b/docs/data-operate/import/data-source/migrate-data-from-other-oltp.md
index 66f6e96557d..c971ae4727b 100644
--- a/docs/data-operate/import/data-source/migrate-data-from-other-oltp.md
+++ b/docs/data-operate/import/data-source/migrate-data-from-other-oltp.md
@@ -155,6 +155,30 @@ val jdbcDF = spark.read
```
For more details, refer to [JDBC To Other Databases](https://spark.apache.org/docs/latest/sql-data-sources-jdbc.html),[Spark Doris Connector](../../../ecosystem/spark-doris-connector.md#batch-write)
+## Streaming Job
+
+Use Streaming Job to synchronize data from MySQL or Postgres.
+
+```sql
+CREATE JOB multi_table_sync
+ON STREAMING
+FROM MYSQL (
+ "jdbc_url" = "jdbc:mysql://127.0.0.1:3306",
+ "driver_url" = "mysql-connector-j-8.0.31.jar",
+ "driver_class" = "com.mysql.cj.jdbc.Driver",
+ "user" = "root",
+ "password" = "123456",
+ "database" = "test",
+ "include_tables" = "user_info,order_info",
+ "offset" = "initial"
+)
+TO DATABASE target_test_db (
+ "table.create.properties.replication_num" = "1"
+)
+```
+
+For more details, refer to: [Postgres/MySQL Continuous Load](../streaming-job/streaming-job-multi-table.md)
+
## DataX / Seatunnel / CloudCanal and other third-party tools.
In addition, you can also use third-party synchronization tools for data synchronization. For more details, please refer to:
diff --git a/docs/data-operate/import/streaming-job/streaming-job-multi-table.md b/docs/data-operate/import/streaming-job/streaming-job-multi-table.md
index 88201f9d8ed..5ac0d6ea910 100644
--- a/docs/data-operate/import/streaming-job/streaming-job-multi-table.md
+++ b/docs/data-operate/import/streaming-job/streaming-job-multi-table.md
@@ -89,7 +89,7 @@ TO DATABASE target_test_db (
### Check Import Status
```sql
-select * from jobs(type=insert) where ExecuteType = "STREAMING"
+select * from jobs("type"="insert") where ExecuteType = "STREAMING"
Id: 1765332859199
Name: mysql_db_sync
Definer: root
@@ -212,7 +212,7 @@ TO DATABASE <target_db> (
After submitting a job, you can run the following SQL to check the job status:
```sql
-select * from jobs(type=insert) where ExecuteType = "STREAMING"
+select * from jobs("type"="insert") where ExecuteType = "STREAMING"
*************************** 1. row ***************************
Id: 1765332859199
Name: mysql_db_sync
@@ -259,7 +259,7 @@ CanceledTaskCount: 0
You can run the following SQL to check the status of each Task:
```sql
-select * from tasks(type='insert') where jobId='1765336137066'
+select * from tasks("type"="insert") where jobId='1765336137066'
*************************** 1. row ***************************
TaskId: 1765336137066
JobId: 1765332859199
diff --git a/docs/data-operate/import/streaming-job/streaming-job-tvf.md b/docs/data-operate/import/streaming-job/streaming-job-tvf.md
index 4948f23c44d..b06b72a9386 100644
--- a/docs/data-operate/import/streaming-job/streaming-job-tvf.md
+++ b/docs/data-operate/import/streaming-job/streaming-job-tvf.md
@@ -50,7 +50,7 @@ select * from S3(
### Check import status
```SQL
-select * from job(type=insert) where ExecuteType = "streaming"
+select * from jobs("type"="insert") where ExecuteType = "STREAMING"
Id: 1758538737484
Name: my_job1
Definer: root
@@ -165,10 +165,10 @@ The module description is as follows:
#### Job
-After a job is successfully submitted, you can execute **select \* from job("insert") where ExecuteType = 'Streaming'** to check the current status of the job.
+After a job is successfully submitted, you can execute **select \* from jobs("type"="insert") where ExecuteType = 'STREAMING'** to check the current status of the job.
```SQL
-select * from job(type=insert) where ExecuteType = "streaming"
+select * from jobs("type"="insert") where ExecuteType = "STREAMING"
Id: 1758538737484
Name: my_job1
Definer: root
@@ -223,12 +223,12 @@ The specific parameter results are displayed as follows:
#### Task
-You can execute `select \* from tasks(type='insert') where jobId='1758534452459'` to view the running status of each Task.
+You can execute `select \* from tasks("type"="insert") where jobId='1758534452459'` to view the running status of each Task.
Note: Only the latest Task information will be retained.
```SQL
-mysql> select * from tasks(type='insert') where jobId='1758534452459'\G
+mysql> select * from tasks("type"="insert") where jobId='1758534452459'\G
*************************** 1. row ***************************
TaskId: 1758534723330
JobId: 1758534452459
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/data-source/migrate-data-from-other-oltp.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/data-source/migrate-data-from-other-oltp.md
index bf346f37338..c09866a034a 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/data-source/migrate-data-from-other-oltp.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/data-source/migrate-data-from-other-oltp.md
@@ -155,6 +155,30 @@ val jdbcDF = spark.read
```
具体可参考:[JDBC To Other Databases](https://spark.apache.org/docs/latest/sql-data-sources-jdbc.html),[Spark Doris Connector](../../../ecosystem/spark-doris-connector.md#批量写入)
+## Streaming Job
+
+使用 Streaming Job 同步 MySQL 或者 Postgres 的数据
+
+```sql
+CREATE JOB multi_table_sync
+ON STREAMING
+FROM MYSQL (
+ "jdbc_url" = "jdbc:mysql://127.0.0.1:3306",
+ "driver_url" = "mysql-connector-j-8.0.31.jar",
+ "driver_class" = "com.mysql.cj.jdbc.Driver",
+ "user" = "root",
+ "password" = "123456",
+ "database" = "test",
+ "include_tables" = "user_info,order_info",
+ "offset" = "initial"
+)
+TO DATABASE target_test_db (
+ "table.create.properties.replication_num" = "1"
+)
+```
+
+具体可参考: [Postgres/MySQL Continuous Load](../streaming-job/streaming-job-multi-table.md)
+
## DataX / Seatunnel / CloudCanal 等三方工具
除此之外,也可以使用第三方同步工具来进行数据同步,更多可参考:
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/streaming-job/streaming-job-multi-table.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/streaming-job/streaming-job-multi-table.md
index 7430040f130..54e1a9e8c76 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/streaming-job/streaming-job-multi-table.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/streaming-job/streaming-job-multi-table.md
@@ -89,7 +89,7 @@ TO DATABASE target_test_db (
### 查看导入状态
```sql
-select * from jobs(type=insert) where ExecuteType = "STREAMING"
+select * from jobs("type"="insert") where ExecuteType = "STREAMING"
Id: 1765332859199
Name: mysql_db_sync
Definer: root
@@ -212,7 +212,7 @@ TO DATABASE <target_db> (
Job 提交成功后,可以执行如下 SQL 查看 Job 当前状态:
```sql
-select * from jobs(type=insert) where ExecuteType = "STREAMING"
+select * from jobs("type"="insert") where ExecuteType = "STREAMING"
*************************** 1. row ***************************
Id: 1765332859199
Name: mysql_db_sync
@@ -261,7 +261,7 @@ CanceledTaskCount: 0
可以执行如下 SQL 查看每次 Task 的运行状态:
```sql
-select * from tasks(type='insert') where jobId='1765336137066'
+select * from tasks("type"="insert") where jobId='1765336137066'
*************************** 1. row ***************************
TaskId: 1765336137066
JobId: 1765332859199
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/streaming-job/streaming-job-tvf.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/streaming-job/streaming-job-tvf.md
index 6bc6b3c98a6..35aae05f1d5 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/streaming-job/streaming-job-tvf.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/streaming-job/streaming-job-tvf.md
@@ -50,7 +50,7 @@ select * from S3(
### 查看导入状态
```SQL
-select * from job(type=insert) where ExecuteType = "streaming"
+select * from jobs("type"="insert") where ExecuteType = "STREAMING"
Id: 1758538737484
Name: my_job1
Definer: root
@@ -164,10 +164,10 @@ DO <Insert_Command>
#### Job
-Job 提交成功后,可以执行 **select \* from job("insert") where ExecuteType = 'Streaming'** 来查看 Job 当前的状态
+Job 提交成功后,可以执行 **select \* from jobs("type"="insert") where ExecuteType = 'STREAMING'** 来查看 Job 当前的状态
```SQL
-select * from job(type=insert) where ExecuteType = "streaming"
+select * from jobs("type"="insert") where ExecuteType = "STREAMING"
Id: 1758538737484
Name: my_job1
Definer: root
@@ -222,12 +222,12 @@ CanceledTaskCount: 0
#### Task
-可以执行**select \* from tasks(type='insert') where jobId='1758534452459'** 来查看每次 Task 的运行状态。
+可以执行**select \* from tasks("type"="insert") where jobId='1758534452459'** 来查看每次 Task 的运行状态。
注:只会保留当前最新的一次 Task 信息。
```SQL
-mysql> select * from tasks(type='insert') where jobId='1758534452459'\G
+mysql> select * from tasks("type"="insert") where jobId='1758534452459'\G
*************************** 1. row ***************************
TaskId: 1758534723330
JobId: 1758534452459
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x.json b/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x.json
index 0a27493b93e..1b8ddc599eb 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x.json
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x.json
@@ -139,6 +139,10 @@
"message": "导入原理",
"description": "The label for category Load Internals in sidebar docs"
},
+ "sidebar.docs.category.Continuous Load": {
+ "message": "持续导入",
+ "description": "The label for category Continuous Load in sidebar docs"
+ },
"sidebar.docs.category.Updating Data": {
"message": "数据更新",
"description": "The label for category Update in sidebar docs"
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/data-operate/import/data-source/migrate-data-from-other-oltp.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/data-operate/import/data-source/migrate-data-from-other-oltp.md
index bf346f37338..c09866a034a 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/data-operate/import/data-source/migrate-data-from-other-oltp.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/data-operate/import/data-source/migrate-data-from-other-oltp.md
@@ -155,6 +155,30 @@ val jdbcDF = spark.read
```
具体可参考:[JDBC To Other Databases](https://spark.apache.org/docs/latest/sql-data-sources-jdbc.html),[Spark Doris Connector](../../../ecosystem/spark-doris-connector.md#批量写入)
+## Streaming Job
+
+使用 Streaming Job 同步 MySQL 或者 Postgres 的数据
+
+```sql
+CREATE JOB multi_table_sync
+ON STREAMING
+FROM MYSQL (
+ "jdbc_url" = "jdbc:mysql://127.0.0.1:3306",
+ "driver_url" = "mysql-connector-j-8.0.31.jar",
+ "driver_class" = "com.mysql.cj.jdbc.Driver",
+ "user" = "root",
+ "password" = "123456",
+ "database" = "test",
+ "include_tables" = "user_info,order_info",
+ "offset" = "initial"
+)
+TO DATABASE target_test_db (
+ "table.create.properties.replication_num" = "1"
+)
+```
+
+具体可参考: [Postgres/MySQL Continuous Load](../streaming-job/streaming-job-multi-table.md)
+
## DataX / Seatunnel / CloudCanal 等三方工具
除此之外,也可以使用第三方同步工具来进行数据同步,更多可参考:
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/data-operate/import/streaming-job/streaming-job-multi-table.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/data-operate/import/streaming-job/streaming-job-multi-table.md
index 7430040f130..54e1a9e8c76 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/data-operate/import/streaming-job/streaming-job-multi-table.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/data-operate/import/streaming-job/streaming-job-multi-table.md
@@ -89,7 +89,7 @@ TO DATABASE target_test_db (
### 查看导入状态
```sql
-select * from jobs(type=insert) where ExecuteType = "STREAMING"
+select * from jobs("type"="insert") where ExecuteType = "STREAMING"
Id: 1765332859199
Name: mysql_db_sync
Definer: root
@@ -212,7 +212,7 @@ TO DATABASE <target_db> (
Job 提交成功后,可以执行如下 SQL 查看 Job 当前状态:
```sql
-select * from jobs(type=insert) where ExecuteType = "STREAMING"
+select * from jobs("type"="insert") where ExecuteType = "STREAMING"
*************************** 1. row ***************************
Id: 1765332859199
Name: mysql_db_sync
@@ -261,7 +261,7 @@ CanceledTaskCount: 0
可以执行如下 SQL 查看每次 Task 的运行状态:
```sql
-select * from tasks(type='insert') where jobId='1765336137066'
+select * from tasks("type"="insert") where jobId='1765336137066'
*************************** 1. row ***************************
TaskId: 1765336137066
JobId: 1765332859199
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/data-operate/import/streaming-job/streaming-job-tvf.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/data-operate/import/streaming-job/streaming-job-tvf.md
index 6bc6b3c98a6..e96b85344e1 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/data-operate/import/streaming-job/streaming-job-tvf.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-4.x/data-operate/import/streaming-job/streaming-job-tvf.md
@@ -164,10 +164,10 @@ DO <Insert_Command>
#### Job
-Job 提交成功后,可以执行 **select \* from job("insert") where ExecuteType = 'Streaming'** 来查看 Job 当前的状态
+Job 提交成功后,可以执行 **select \* from jobs("type"="insert") where ExecuteType = 'STREAMING'** 来查看 Job 当前的状态
```SQL
-select * from job(type=insert) where ExecuteType = "streaming"
+select * from jobs("type"="insert") where ExecuteType = "STREAMING"
Id: 1758538737484
Name: my_job1
Definer: root
@@ -222,12 +222,12 @@ CanceledTaskCount: 0
#### Task
-可以执行**select \* from tasks(type='insert') where jobId='1758534452459'** 来查看每次 Task 的运行状态。
+可以执行**select \* from tasks("type"="insert") where jobId='1758534452459'** 来查看每次 Task 的运行状态。
注:只会保留当前最新的一次 Task 信息。
```SQL
-mysql> select * from tasks(type='insert') where jobId='1758534452459'\G
+mysql> select * from tasks("type"="insert") where jobId='1758534452459'\G
*************************** 1. row ***************************
TaskId: 1758534723330
JobId: 1758534452459
diff --git a/versioned_docs/version-4.x/data-operate/import/data-source/migrate-data-from-other-oltp.md b/versioned_docs/version-4.x/data-operate/import/data-source/migrate-data-from-other-oltp.md
index 66f6e96557d..c971ae4727b 100644
--- a/versioned_docs/version-4.x/data-operate/import/data-source/migrate-data-from-other-oltp.md
+++ b/versioned_docs/version-4.x/data-operate/import/data-source/migrate-data-from-other-oltp.md
@@ -155,6 +155,30 @@ val jdbcDF = spark.read
```
For more details, refer to [JDBC To Other Databases](https://spark.apache.org/docs/latest/sql-data-sources-jdbc.html),[Spark Doris Connector](../../../ecosystem/spark-doris-connector.md#batch-write)
+## Streaming Job
+
+Use Streaming Job to synchronize data from MySQL or Postgres.
+
+```sql
+CREATE JOB multi_table_sync
+ON STREAMING
+FROM MYSQL (
+ "jdbc_url" = "jdbc:mysql://127.0.0.1:3306",
+ "driver_url" = "mysql-connector-j-8.0.31.jar",
+ "driver_class" = "com.mysql.cj.jdbc.Driver",
+ "user" = "root",
+ "password" = "123456",
+ "database" = "test",
+ "include_tables" = "user_info,order_info",
+ "offset" = "initial"
+)
+TO DATABASE target_test_db (
+ "table.create.properties.replication_num" = "1"
+)
+```
+
+For more details, refer to: [Postgres/MySQL Continuous Load](../streaming-job/streaming-job-multi-table.md)
+
## DataX / Seatunnel / CloudCanal and other third-party tools.
In addition, you can also use third-party synchronization tools for data synchronization. For more details, please refer to:
diff --git a/versioned_docs/version-4.x/data-operate/import/streaming-job/streaming-job-multi-table.md b/versioned_docs/version-4.x/data-operate/import/streaming-job/streaming-job-multi-table.md
index 88201f9d8ed..5ac0d6ea910 100644
--- a/versioned_docs/version-4.x/data-operate/import/streaming-job/streaming-job-multi-table.md
+++ b/versioned_docs/version-4.x/data-operate/import/streaming-job/streaming-job-multi-table.md
@@ -89,7 +89,7 @@ TO DATABASE target_test_db (
### Check Import Status
```sql
-select * from jobs(type=insert) where ExecuteType = "STREAMING"
+select * from jobs("type"="insert") where ExecuteType = "STREAMING"
Id: 1765332859199
Name: mysql_db_sync
Definer: root
@@ -212,7 +212,7 @@ TO DATABASE <target_db> (
After submitting a job, you can run the following SQL to check the job status:
```sql
-select * from jobs(type=insert) where ExecuteType = "STREAMING"
+select * from jobs("type"="insert") where ExecuteType = "STREAMING"
*************************** 1. row ***************************
Id: 1765332859199
Name: mysql_db_sync
@@ -259,7 +259,7 @@ CanceledTaskCount: 0
You can run the following SQL to check the status of each Task:
```sql
-select * from tasks(type='insert') where jobId='1765336137066'
+select * from tasks("type"="insert") where jobId='1765336137066'
*************************** 1. row ***************************
TaskId: 1765336137066
JobId: 1765332859199
diff --git a/versioned_docs/version-4.x/data-operate/import/streaming-job/streaming-job-tvf.md b/versioned_docs/version-4.x/data-operate/import/streaming-job/streaming-job-tvf.md
index 4948f23c44d..b06b72a9386 100644
--- a/versioned_docs/version-4.x/data-operate/import/streaming-job/streaming-job-tvf.md
+++ b/versioned_docs/version-4.x/data-operate/import/streaming-job/streaming-job-tvf.md
@@ -50,7 +50,7 @@ select * from S3(
### Check import status
```SQL
-select * from job(type=insert) where ExecuteType = "streaming"
+select * from jobs("type"="insert") where ExecuteType = "STREAMING"
Id: 1758538737484
Name: my_job1
Definer: root
@@ -165,10 +165,10 @@ The module description is as follows:
#### Job
-After a job is successfully submitted, you can execute **select \* from job("insert") where ExecuteType = 'Streaming'** to check the current status of the job.
+After a job is successfully submitted, you can execute **select \* from jobs("type"="insert") where ExecuteType = 'STREAMING'** to check the current status of the job.
```SQL
-select * from job(type=insert) where ExecuteType = "streaming"
+select * from jobs("type"="insert") where ExecuteType = "STREAMING"
Id: 1758538737484
Name: my_job1
Definer: root
@@ -223,12 +223,12 @@ The specific parameter results are displayed as follows:
#### Task
-You can execute `select \* from tasks(type='insert') where jobId='1758534452459'` to view the running status of each Task.
+You can execute `select \* from tasks("type"="insert") where jobId='1758534452459'` to view the running status of each Task.
Note: Only the latest Task information will be retained.
```SQL
-mysql> select * from tasks(type='insert') where jobId='1758534452459'\G
+mysql> select * from tasks("type"="insert") where jobId='1758534452459'\G
*************************** 1. row ***************************
TaskId: 1758534723330
JobId: 1758534452459