This is an automated email from the ASF dual-hosted git repository.

liaoxin pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris-website.git


The following commit(s) were added to refs/heads/master by this push:
     new dc2932a4403 [doc](log analysis) update log-analysis example (#2404)
dc2932a4403 is described below

commit dc2932a4403bb430055641ab2b2498345fd2bc82
Author: hui lai <1353307...@qq.com>
AuthorDate: Fri May 23 11:14:28 2025 +0800

    [doc](log analysis) update log-analysis example (#2404)
---
 blog/log-analysis-elasticsearch-vs-apache-doris.md     |  8 ++++----
 docs/log-storage-analysis.md                           | 18 ++++++++----------
 docs/observability/log.md                              | 18 ++++++++----------
 .../current/log-storage-analysis.md                    | 18 ++++++++----------
 .../current/observability/log-storage-analysis.md      | 18 ++++++++----------
 .../current/observability/log.md                       | 18 ++++++++----------
 .../version-2.0/log-storage-analysis.md                | 18 ++++++++----------
 .../version-2.1/log-storage-analysis.md                | 16 +++++++---------
 .../version-2.1/observability/log-storage-analysis.md  | 16 +++++++---------
 .../version-2.1/observability/log.md                   | 16 +++++++---------
 .../version-3.0/log-storage-analysis.md                | 18 ++++++++----------
 .../version-3.0/observability/log-storage-analysis.md  | 16 +++++++---------
 .../version-3.0/observability/log.md                   | 18 ++++++++----------
 .../practical-guide/log-storage-analysis.md            | 18 ++++++++----------
 versioned_docs/version-2.1/log-storage-analysis.md     | 16 +++++++---------
 versioned_docs/version-2.1/observability/log.md        | 18 ++++++++----------
 .../practical-guide/log-storage-analysis.md            | 18 ++++++++----------
 versioned_docs/version-3.0/log-storage-analysis.md     | 18 ++++++++----------
 versioned_docs/version-3.0/observability/log.md        | 18 ++++++++----------
 .../practical-guide/log-storage-analysis.md            | 18 ++++++++----------
 20 files changed, 151 insertions(+), 189 deletions(-)
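
For reviewers, every file touched here converges on the same Routine Load example. Assembled as a sketch from the hunks below (the `FROM KAFKA` clause in the docs also carries `property.*` Librdkafka settings that the diff context does not show, so only the broker and topic placeholders appear here):

```sql
CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table
COLUMNS(ts, clientip, request, status, size)
PROPERTIES (
    "max_batch_interval" = "60",          -- flush a batch at most every 60 s
    "max_batch_rows" = "20000000",        -- or once 20 million rows accumulate
    "max_batch_size" = "1073741824",      -- or once the batch reaches 1 GiB (2^30 bytes)
    "load_to_single_tablet" = "true",     -- write each batch to a single tablet
    "format" = "json"
)
FROM KAFKA (
    "kafka_broker_list" = "host:port",
    "kafka_topic" = "log__topic_"
);
```

The change replaces the old small-batch settings (10 s / 1 M rows / ~100 MB) with larger batches, and drops the `timeout` and `strict_mode` properties from the example.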

diff --git a/blog/log-analysis-elasticsearch-vs-apache-doris.md b/blog/log-analysis-elasticsearch-vs-apache-doris.md
index a8c2bc026fc..890488c58c2 100644
--- a/blog/log-analysis-elasticsearch-vs-apache-doris.md
+++ b/blog/log-analysis-elasticsearch-vs-apache-doris.md
@@ -233,10 +233,10 @@ For JSON logs that are written into Kafka message queues, create [Routine Load](
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table
 COLUMNS(ts, clientip, request, status, size)
 PROPERTIES (
-    "max_batch_interval" = "10",
-    "max_batch_rows" = "1000000",
-    "max_batch_size" = "109715200",
-    "strict_mode" = "false",
+    "max_batch_interval" = "60",
+    "max_batch_rows" = "20000000",
+    "max_batch_size" = "1073741824", 
+    "load_to_single_tablet" = "true",
     "format" = "json"
 )
 FROM KAFKA (
diff --git a/docs/log-storage-analysis.md b/docs/log-storage-analysis.md
index 258ebe170cd..332e70570b3 100644
--- a/docs/log-storage-analysis.md
+++ b/docs/log-storage-analysis.md
@@ -476,15 +476,13 @@ You can refer to the example below, where `property.*` represents Librdkafka cli
 ```SQL  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
-)  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
+)
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
 "kafka_topic" = "log__topic_",  
@@ -498,7 +496,7 @@ FROM KAFKA (
 <br />SHOW ROUTINE LOAD;
 ```
 
-For more information about Kafka, see [Routine Load](./data-operate/import/import-way/routine-load-manual).
+For more information about Kafka, see [Routine Load](./data-operate/import/import-way/routine-load-manual.md).
 
 **Using customized programs to collect logs**
 
diff --git a/docs/observability/log.md b/docs/observability/log.md
index a244f95a67b..e38f35ee2b8 100644
--- a/docs/observability/log.md
+++ b/docs/observability/log.md
@@ -392,15 +392,13 @@ You can refer to the example below, where `property.*` represents Librdkafka cli
 ```SQL  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
-)  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
+) 
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
 "kafka_topic" = "log__topic_",  
@@ -414,7 +412,7 @@ FROM KAFKA (
 <br />SHOW ROUTINE LOAD;
 ```
 
-For more information about Kafka, see [Routine Load](../../data-operate/import/import-way/routine-load-manual.md).
+For more information about Kafka, see [Routine Load](../data-operate/import/import-way/routine-load-manual.md).
 
 **Using customized programs to collect logs**
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/log-storage-analysis.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/log-storage-analysis.md
index 78e6bd25011..1db46f0d6c4 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/log-storage-analysis.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/log-storage-analysis.md
@@ -455,15 +455,13 @@ chmod +x filebeat-doris-1.0.0
 -- 创建 routine load,从 kafka log__topic_将数据导入 log_table 表  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
-)  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
+)
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
 "kafka_topic" = "log__topic_",  
@@ -478,7 +476,7 @@ FROM KAFKA (
 SHOW ROUTINE LOAD;
 ```
 
-更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](./data-operate/import/import-way/routine-load-manual)。
+更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](./data-operate/import/import-way/routine-load-manual.md)。
 
 **使用自定义程序采集日志**
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/observability/log-storage-analysis.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/observability/log-storage-analysis.md
index 14d41592983..7686bfb1ae7 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/observability/log-storage-analysis.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/observability/log-storage-analysis.md
@@ -374,15 +374,13 @@ chmod +x filebeat-doris-1.0.0
 -- 创建 routine load,从 kafka log__topic_将数据导入 log_table 表  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
-)  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
+)
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
 "kafka_topic" = "log__topic_",  
@@ -397,7 +395,7 @@ FROM KAFKA (
 SHOW ROUTINE LOAD;
 ```
 
-更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](./data-operate/import/import-way/routine-load-manual)。
+更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](../data-operate/import/import-way/routine-load-manual.md)。
 
 **使用自定义程序采集日志**
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/observability/log.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/observability/log.md
index efb467875b1..ce677e142c7 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/observability/log.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/observability/log.md
@@ -384,15 +384,13 @@ chmod +x filebeat-doris-7.17.5.4
 -- 创建 routine load,从 kafka log__topic_将数据导入 log_table 表  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
-)  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
+) 
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
 "kafka_topic" = "log__topic_",  
@@ -407,7 +405,7 @@ FROM KAFKA (
 SHOW ROUTINE LOAD;
 ```
 
-更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](./data-operate/import/import-way/routine-load-manual)。
+更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](../data-operate/import/import-way/routine-load-manual.md)。
 
 **使用自定义程序采集日志**
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.0/log-storage-analysis.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.0/log-storage-analysis.md
index 80a0c89905a..87aef38492c 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.0/log-storage-analysis.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.0/log-storage-analysis.md
@@ -455,15 +455,13 @@ chmod +x filebeat-doris-1.0.0
 -- 创建 routine load,从 kafka log__topic_将数据导入 log_table 表  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
-)  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
+) 
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
 "kafka_topic" = "log__topic_",  
@@ -478,7 +476,7 @@ FROM KAFKA (
 SHOW ROUTINE LOAD;
 ```
 
-更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](./data-operate/import/import-way/routine-load-manual)。
+更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](./data-operate/import/routine-load-manual.md)。
 
 **使用自定义程序采集日志**
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/log-storage-analysis.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/log-storage-analysis.md
index 14d41592983..e9b12f32ba5 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/log-storage-analysis.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/log-storage-analysis.md
@@ -374,14 +374,12 @@ chmod +x filebeat-doris-1.0.0
 -- 创建 routine load,从 kafka log__topic_将数据导入 log_table 表  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
 )  
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
@@ -397,7 +395,7 @@ FROM KAFKA (
 SHOW ROUTINE LOAD;
 ```
 
-更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](./data-operate/import/import-way/routine-load-manual)。
+更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](./data-operate/import/import-way/routine-load-manual.md)。
 
 **使用自定义程序采集日志**
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/observability/log-storage-analysis.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/observability/log-storage-analysis.md
index 14d41592983..a55010f09b0 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/observability/log-storage-analysis.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/observability/log-storage-analysis.md
@@ -374,14 +374,12 @@ chmod +x filebeat-doris-1.0.0
 -- 创建 routine load,从 kafka log__topic_将数据导入 log_table 表  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
 )  
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
@@ -397,7 +395,7 @@ FROM KAFKA (
 SHOW ROUTINE LOAD;
 ```
 
-更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](./data-operate/import/import-way/routine-load-manual)。
+更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](../data-operate/import/import-way/routine-load-manual.md)。
 
 **使用自定义程序采集日志**
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/observability/log.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/observability/log.md
index efb467875b1..4fd8cb9773e 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/observability/log.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/observability/log.md
@@ -384,14 +384,12 @@ chmod +x filebeat-doris-7.17.5.4
 -- 创建 routine load,从 kafka log__topic_将数据导入 log_table 表  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
 )  
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
@@ -407,7 +405,7 @@ FROM KAFKA (
 SHOW ROUTINE LOAD;
 ```
 
-更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](./data-operate/import/import-way/routine-load-manual)。
+更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](../data-operate/import/import-way/routine-load-manual.md)。
 
 **使用自定义程序采集日志**
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/log-storage-analysis.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/log-storage-analysis.md
index 14d41592983..29bb1556963 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/log-storage-analysis.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/log-storage-analysis.md
@@ -374,15 +374,13 @@ chmod +x filebeat-doris-1.0.0
 -- 创建 routine load,从 kafka log__topic_将数据导入 log_table 表  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
-)  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
+)
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
 "kafka_topic" = "log__topic_",  
@@ -397,7 +395,7 @@ FROM KAFKA (
 SHOW ROUTINE LOAD;
 ```
 
-更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](./data-operate/import/import-way/routine-load-manual)。
+更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](./data-operate/import/import-way/routine-load-manual.md)。
 
 **使用自定义程序采集日志**
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/observability/log-storage-analysis.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/observability/log-storage-analysis.md
index 14d41592983..a55010f09b0 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/observability/log-storage-analysis.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/observability/log-storage-analysis.md
@@ -374,14 +374,12 @@ chmod +x filebeat-doris-1.0.0
 -- 创建 routine load,从 kafka log__topic_将数据导入 log_table 表  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
 )  
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
@@ -397,7 +395,7 @@ FROM KAFKA (
 SHOW ROUTINE LOAD;
 ```
 
-更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](./data-operate/import/import-way/routine-load-manual)。
+更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](../data-operate/import/import-way/routine-load-manual.md)。
 
 **使用自定义程序采集日志**
 
diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/observability/log.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/observability/log.md
index efb467875b1..ce677e142c7 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/observability/log.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/observability/log.md
@@ -384,15 +384,13 @@ chmod +x filebeat-doris-7.17.5.4
 -- 创建 routine load,从 kafka log__topic_将数据导入 log_table 表  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
-)  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
+) 
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
 "kafka_topic" = "log__topic_",  
@@ -407,7 +405,7 @@ FROM KAFKA (
 SHOW ROUTINE LOAD;
 ```
 
-更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](./data-operate/import/import-way/routine-load-manual)。
+更多关于 Kafka 配置和使用的说明,可参考 [Routine Load](../data-operate/import/import-way/routine-load-manual.md)。
 
 **使用自定义程序采集日志**
 
diff --git a/versioned_docs/version-2.0/practical-guide/log-storage-analysis.md b/versioned_docs/version-2.0/practical-guide/log-storage-analysis.md
index bddb46de306..7dc10c4fded 100644
--- a/versioned_docs/version-2.0/practical-guide/log-storage-analysis.md
+++ b/versioned_docs/version-2.0/practical-guide/log-storage-analysis.md
@@ -465,15 +465,13 @@ You can refer to the example below, where `property.*` represents Librdkafka cli
 ```SQL  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
-)  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
+) 
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
 "kafka_topic" = "log__topic_",  
@@ -487,7 +485,7 @@ FROM KAFKA (
 <br />SHOW ROUTINE LOAD;
 ```
 
-For more information about Kafka, see [Routine Load](../../data-operate/import/import-way/routine-load-manual.md).
+For more information about Kafka, see [Routine Load](../data-operate/import/routine-load-manual.md).
 
 **Using customized programs to collect logs**
 
diff --git a/versioned_docs/version-2.1/log-storage-analysis.md b/versioned_docs/version-2.1/log-storage-analysis.md
index de063018b9d..f9c1a7af7f0 100644
--- a/versioned_docs/version-2.1/log-storage-analysis.md
+++ b/versioned_docs/version-2.1/log-storage-analysis.md
@@ -475,14 +475,12 @@ You can refer to the example below, where `property.*` represents Librdkafka cli
 ```SQL  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
 )  
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
@@ -497,7 +495,7 @@ FROM KAFKA (
 <br />SHOW ROUTINE LOAD;
 ```
 
-For more information about Kafka, see [Routine Load](./data-operate/import/import-way/routine-load-manual).
+For more information about Kafka, see [Routine Load](./data-operate/import/import-way/routine-load-manual.md).
 
 **Using customized programs to collect logs**
 
diff --git a/versioned_docs/version-2.1/observability/log.md b/versioned_docs/version-2.1/observability/log.md
index a244f95a67b..e38f35ee2b8 100644
--- a/versioned_docs/version-2.1/observability/log.md
+++ b/versioned_docs/version-2.1/observability/log.md
@@ -392,15 +392,13 @@ You can refer to the example below, where `property.*` represents Librdkafka cli
 ```SQL  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
-)  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
+) 
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
 "kafka_topic" = "log__topic_",  
@@ -414,7 +412,7 @@ FROM KAFKA (
 <br />SHOW ROUTINE LOAD;
 ```
 
-For more information about Kafka, see [Routine Load](../../data-operate/import/import-way/routine-load-manual.md).
+For more information about Kafka, see [Routine Load](../data-operate/import/import-way/routine-load-manual.md).
 
 **Using customized programs to collect logs**
 
diff --git a/versioned_docs/version-2.1/practical-guide/log-storage-analysis.md b/versioned_docs/version-2.1/practical-guide/log-storage-analysis.md
index dddd69b9f4d..d48641b59ef 100644
--- a/versioned_docs/version-2.1/practical-guide/log-storage-analysis.md
+++ b/versioned_docs/version-2.1/practical-guide/log-storage-analysis.md
@@ -475,15 +475,13 @@ You can refer to the example below, where `property.*` represents Librdkafka cli
 ```SQL  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
-)  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
+)
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
 "kafka_topic" = "log__topic_",  
@@ -497,7 +495,7 @@ FROM KAFKA (
 <br />SHOW ROUTINE LOAD;
 ```
 
-For more information about Kafka, see [Routine Load](../data-operate/import/import-way/routine-load-manual).
+For more information about Kafka, see [Routine Load](../data-operate/import/import-way/routine-load-manual.md).
 
 **Using customized programs to collect logs**
 
diff --git a/versioned_docs/version-3.0/log-storage-analysis.md b/versioned_docs/version-3.0/log-storage-analysis.md
index 4c532b792f0..9ca0bcc4b1d 100644
--- a/versioned_docs/version-3.0/log-storage-analysis.md
+++ b/versioned_docs/version-3.0/log-storage-analysis.md
@@ -475,15 +475,13 @@ You can refer to the example below, where `property.*` represents Librdkafka cli
 ```SQL  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
-)  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
+)
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
 "kafka_topic" = "log__topic_",  
@@ -497,7 +495,7 @@ FROM KAFKA (
 <br />SHOW ROUTINE LOAD;
 ```
 
-For more information about Kafka, see [Routine Load](./data-operate/import/import-way/routine-load-manual).
+For more information about Kafka, see [Routine Load](./data-operate/import/import-way/routine-load-manual.md).
 
 **Using customized programs to collect logs**
 
diff --git a/versioned_docs/version-3.0/observability/log.md b/versioned_docs/version-3.0/observability/log.md
index a244f95a67b..e38f35ee2b8 100644
--- a/versioned_docs/version-3.0/observability/log.md
+++ b/versioned_docs/version-3.0/observability/log.md
@@ -392,15 +392,13 @@ You can refer to the example below, where `property.*` represents Librdkafka cli
 ```SQL  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
-)  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
+) 
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
 "kafka_topic" = "log__topic_",  
@@ -414,7 +412,7 @@ FROM KAFKA (
 <br />SHOW ROUTINE LOAD;
 ```
 
-For more information about Kafka, see [Routine Load](../../data-operate/import/import-way/routine-load-manual.md).
+For more information about Kafka, see [Routine Load](../data-operate/import/import-way/routine-load-manual.md).
 
 **Using customized programs to collect logs**
 
diff --git a/versioned_docs/version-3.0/practical-guide/log-storage-analysis.md b/versioned_docs/version-3.0/practical-guide/log-storage-analysis.md
index 56c531a22fe..d6cb0a77b63 100644
--- a/versioned_docs/version-3.0/practical-guide/log-storage-analysis.md
+++ b/versioned_docs/version-3.0/practical-guide/log-storage-analysis.md
@@ -475,15 +475,13 @@ You can refer to the example below, where `property.*` represents Librdkafka cli
 ```SQL  
 CREATE ROUTINE LOAD load_log_kafka ON log_db.log_table  
 COLUMNS(ts, clientip, request, status, size)  
-PROPERTIES (  
-"max_batch_interval" = "10",  
-"max_batch_rows" = "1000000",  
-"max_batch_size" = "109715200",  
-"load_to_single_tablet" = "true",  
-"timeout" = "600",  
-"strict_mode" = "false",  
-"format" = "json"  
-)  
+PROPERTIES (
+"max_batch_interval" = "60",
+"max_batch_rows" = "20000000",
+"max_batch_size" = "1073741824", 
+"load_to_single_tablet" = "true",
+"format" = "json"
+)
 FROM KAFKA (  
 "kafka_broker_list" = "host:port",  
 "kafka_topic" = "log__topic_",  
@@ -497,7 +495,7 @@ FROM KAFKA (
 <br />SHOW ROUTINE LOAD;
 ```
 
-For more information about Kafka, see [Routine Load](../data-operate/import/import-way/routine-load-manual).
+For more information about Kafka, see [Routine Load](../data-operate/import/import-way/routine-load-manual.md).
 
 **Using customized programs to collect logs**
 


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org
