morrySnow commented on code in PR #1915:
URL: https://github.com/apache/doris-website/pull/1915#discussion_r1928499911


##########
docs/sql-manual/sql-statements/data-modification/load-and-export/ALTER-ROUTINE-LOAD.md:
##########
@@ -27,98 +27,96 @@ under the License.
 
 ## Description
 
-This syntax is used to modify an already created routine import job.
+This syntax is used to modify an existing routine load job. Only jobs in PAUSED state can be modified.
 
-Only jobs in the PAUSED state can be modified.
-
-grammar:
+## Syntax
 
 ```sql
-ALTER ROUTINE LOAD FOR [db.]job_name
-[job_properties]
-FROM data_source
-[data_source_properties]
+ALTER ROUTINE LOAD FOR [<db.>]<job_name>
+[<job_properties>]
+FROM <data_source>
+[<data_source_properties>]
 ```
 
-1. `[db.]job_name`
-
-   Specifies the job name to modify.
-
-2. `tbl_name`
-
-   Specifies the name of the table to be imported.
-
-3. `job_properties`
-
-   Specifies the job parameters that need to be modified. Currently, only the modification of the following parameters is supported:
-
-   1. `desired_concurrent_number`
-   2. `max_error_number`
-   3. `max_batch_interval`
-   4. `max_batch_rows`
-   5. `max_batch_size`
-   6. `jsonpaths`
-   7. `json_root`
-   8. `strip_outer_array`
-   9. `strict_mode`
-   10. `timezone`
-   11. `num_as_string`
-   12. `fuzzy_parse`
-   13. `partial_columns`
-   14. `max_filter_ratio`
-
-
-4. `data_source`
-
-   The type of data source. Currently supports:
-
-   KAFKA
-
-5. `data_source_properties`
-
-   Relevant properties of the data source. Currently only supports:
-
-   1. `kafka_partitions`
-   2. `kafka_offsets`
-   3. `kafka_broker_list`
-   4. `kafka_topic`
-   5. Custom properties, such as `property.group.id`
-
-   Note:
-
-   1. `kafka_partitions` and `kafka_offsets` are used to modify the offset of the kafka partition to be consumed, only the currently consumed partition can be modified. Cannot add partition.
-
-## Example
-
-1. Change `desired_concurrent_number` to 1
-
-   ```sql
-   ALTER ROUTINE LOAD FOR db1.label1
-   PROPERTIES
-   (
-       "desired_concurrent_number" = "1"
-   );
-   ```
-
-2. Modify `desired_concurrent_number` to 10, modify the offset of the partition, and modify the group id.
-
-   ```sql
-   ALTER ROUTINE LOAD FOR db1.label1
-   PROPERTIES
-   (
-       "desired_concurrent_number" = "10"
-   )
-   FROM kafka
-   (
-       "kafka_partitions" = "0, 1, 2",
-       "kafka_offsets" = "100, 200, 100",
-       "property.group.id" = "new_group"
-   );
-   ```
-
-## Keywords
-
-    ALTER, ROUTINE, LOAD
-
-## Best Practice
-
+## Required Parameters
+
+**1. `[db.]job_name`**

Review Comment:
   ```suggestion
   **1. `[<db>.]<job_name>`**
   ```



##########
docs/sql-manual/sql-statements/data-modification/load-and-export/ALTER-ROUTINE-LOAD.md:
##########
@@ -27,98 +27,96 @@ under the License.
 
 ## Description
 
-This syntax is used to modify an already created routine import job.
+This syntax is used to modify an existing routine load job. Only jobs in PAUSED state can be modified.
 
-Only jobs in the PAUSED state can be modified.
-
-grammar:
+## Syntax
 
 ```sql
-ALTER ROUTINE LOAD FOR [db.]job_name
-[job_properties]
-FROM data_source
-[data_source_properties]
+ALTER ROUTINE LOAD FOR [<db.>]<job_name>
+[<job_properties>]
+FROM <data_source>
+[<data_source_properties>]
 ```
 
-1. `[db.]job_name`
-
-   Specifies the job name to modify.
-
-2. `tbl_name`
-
-   Specifies the name of the table to be imported.
-
-3. `job_properties`
-
-   Specifies the job parameters that need to be modified. Currently, only the modification of the following parameters is supported:
-
-   1. `desired_concurrent_number`
-   2. `max_error_number`
-   3. `max_batch_interval`
-   4. `max_batch_rows`
-   5. `max_batch_size`
-   6. `jsonpaths`
-   7. `json_root`
-   8. `strip_outer_array`
-   9. `strict_mode`
-   10. `timezone`
-   11. `num_as_string`
-   12. `fuzzy_parse`
-   13. `partial_columns`
-   14. `max_filter_ratio`
-
-
-4. `data_source`
-
-   The type of data source. Currently supports:
-
-   KAFKA
-
-5. `data_source_properties`
-
-   Relevant properties of the data source. Currently only supports:
-
-   1. `kafka_partitions`
-   2. `kafka_offsets`
-   3. `kafka_broker_list`
-   4. `kafka_topic`
-   5. Custom properties, such as `property.group.id`
-
-   Note:
-
-   1. `kafka_partitions` and `kafka_offsets` are used to modify the offset of the kafka partition to be consumed, only the currently consumed partition can be modified. Cannot add partition.
-
-## Example
-
-1. Change `desired_concurrent_number` to 1
-
-   ```sql
-   ALTER ROUTINE LOAD FOR db1.label1
-   PROPERTIES
-   (
-       "desired_concurrent_number" = "1"
-   );
-   ```
-
-2. Modify `desired_concurrent_number` to 10, modify the offset of the partition, and modify the group id.
-
-   ```sql
-   ALTER ROUTINE LOAD FOR db1.label1
-   PROPERTIES
-   (
-       "desired_concurrent_number" = "10"
-   )
-   FROM kafka
-   (
-       "kafka_partitions" = "0, 1, 2",
-       "kafka_offsets" = "100, 200, 100",
-       "property.group.id" = "new_group"
-   );
-   ```
-
-## Keywords
-
-    ALTER, ROUTINE, LOAD
-
-## Best Practice
-
+## Required Parameters
+
+**1. `[db.]job_name`**
+
+> Specifies the name of the job to be modified. The identifier must begin with a letter character and cannot contain spaces or special characters unless the entire identifier string is enclosed in backticks.
+>
+> The identifier cannot use reserved keywords. For more details, please refer to identifier requirements and reserved keywords.
+
+**2. `job_properties`**

Review Comment:
   ```suggestion
   **2. `<job_properties>`**
   ```



##########
docs/sql-manual/sql-statements/data-modification/load-and-export/PAUSE-ROUTINE-LOAD.md:
##########
@@ -25,31 +25,51 @@ under the License.
 -->
 
 
-## Description
+## Description## Description
 
-Used to pause a Routine Load job. A suspended job can be rerun with the RESUME command.
+This syntax is used to pause one or all Routine Load jobs. Paused jobs can be restarted using the RESUME command.
+
+## Syntax
 
 ```sql
-PAUSE [ALL] ROUTINE LOAD FOR job_name
+PAUSE [<ALL>] ROUTINE LOAD FOR <job_name>
 ```
 
-## Example
+## Required Parameters
+
+**1. `job_name`**

Review Comment:
   ```suggestion
   **1. `<job_name>`**
   ```



##########
docs/sql-manual/sql-statements/data-modification/load-and-export/PAUSE-ROUTINE-LOAD.md:
##########
@@ -25,31 +25,51 @@ under the License.
 -->
 
 
-## Description
+## Description## Description
 
-Used to pause a Routine Load job. A suspended job can be rerun with the RESUME command.
+This syntax is used to pause one or all Routine Load jobs. Paused jobs can be restarted using the RESUME command.
+
+## Syntax
 
 ```sql
-PAUSE [ALL] ROUTINE LOAD FOR job_name
+PAUSE [<ALL>] ROUTINE LOAD FOR <job_name>

Review Comment:
   ALL is a keyword here, not a parameter, so it should not be wrapped in angle brackets.
   ```suggestion
   PAUSE [ALL] ROUTINE LOAD FOR <job_name>
   ```
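
   For context, the convention being suggested puts angle brackets only around user-supplied placeholders; literal keywords are typed as-is. An illustrative sketch (`example_job` is a made-up name, and the bare `ALL` form follows the upstream examples):

   ```sql
   -- ALL is a literal keyword, written exactly as shown
   PAUSE ALL ROUTINE LOAD;

   -- example_job substitutes for the <job_name> placeholder
   PAUSE ROUTINE LOAD FOR example_job;
   ```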



##########
docs/sql-manual/sql-statements/data-modification/load-and-export/PAUSE-ROUTINE-LOAD.md:
##########
@@ -25,31 +25,51 @@ under the License.
 -->
 
 
-## Description
+## Description## Description

Review Comment:
   Was this pasted here by mistake?



##########
docs/sql-manual/sql-statements/data-modification/load-and-export/CREATE-ROUTINE-LOAD.md:
##########
@@ -27,357 +27,381 @@ under the License.
 
 ## Description
 
-The Routine Load function allows users to submit a resident import task, and import data into Doris by continuously reading data from a specified data source.
+The Routine Load feature allows users to submit a resident import task that continuously reads data from a specified data source and imports it into Doris.
 
-Currently, only data in CSV or Json format can be imported from Kakfa through unauthenticated or SSL authentication. [Example of importing data in Json format](../../../../data-operate/import/import-way/routine-load-manual.md#Example_of_importing_data_in_Json_format)
+Currently, it only supports importing CSV or Json format data from Kafka through unauthenticated or SSL authentication methods. [Example of importing Json format data](../../../../data-operate/import/import-way/routine-load-manual.md#Example-of-importing-Json-format-data)
 
-grammar:
+## Syntax
 
 ```sql
-CREATE ROUTINE LOAD [db.]job_name [ON tbl_name]
-[merge_type]
-[load_properties]
-[job_properties]
-FROM data_source [data_source_properties]
-[COMMENT "comment"]
+CREATE ROUTINE LOAD [<db.>]<job_name> [ON <tbl_name>]
+[<merge_type>]
+[<load_properties>]
+[<job_properties>]
+FROM <data_source> [<data_source_properties>]
+[<COMMENT "comment">]

Review Comment:
   ```suggestion
   [COMMENT "<comment>"]
   ```
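
   Under the suggested notation, only the quoted comment string is a placeholder; the `COMMENT` keyword itself is literal. A minimal sketch of where the clause lands in a full statement (all names and the broker address are illustrative, not taken from the PR):

   ```sql
   CREATE ROUTINE LOAD example_db.example_job ON example_tbl
   PROPERTIES
   (
       "desired_concurrent_number" = "1"
   )
   FROM KAFKA
   (
       "kafka_broker_list" = "broker1:9092",
       "kafka_topic" = "example_topic"
   )
   COMMENT "nightly ingest from Kafka";
   ```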



##########
docs/sql-manual/sql-statements/data-modification/load-and-export/ALTER-ROUTINE-LOAD.md:
##########
@@ -27,98 +27,96 @@ under the License.
 
 ## Description
 
-This syntax is used to modify an already created routine import job.
+This syntax is used to modify an existing routine load job. Only jobs in PAUSED state can be modified.
 
-Only jobs in the PAUSED state can be modified.
-
-grammar:
+## Syntax
 
 ```sql
-ALTER ROUTINE LOAD FOR [db.]job_name
-[job_properties]
-FROM data_source
-[data_source_properties]
+ALTER ROUTINE LOAD FOR [<db.>]<job_name>

Review Comment:
   ```suggestion
   ALTER ROUTINE LOAD FOR [<db>.]<job_name>
   ```



##########
docs/sql-manual/sql-statements/data-modification/load-and-export/CREATE-ROUTINE-LOAD.md:
##########
@@ -27,357 +27,381 @@ under the License.
 
 ## Description
 
-The Routine Load function allows users to submit a resident import task, and import data into Doris by continuously reading data from a specified data source.
+The Routine Load feature allows users to submit a resident import task that continuously reads data from a specified data source and imports it into Doris.
 
-Currently, only data in CSV or Json format can be imported from Kakfa through unauthenticated or SSL authentication. [Example of importing data in Json format](../../../../data-operate/import/import-way/routine-load-manual.md#Example_of_importing_data_in_Json_format)
+Currently, it only supports importing CSV or Json format data from Kafka through unauthenticated or SSL authentication methods. [Example of importing Json format data](../../../../data-operate/import/import-way/routine-load-manual.md#Example-of-importing-Json-format-data)
 
-grammar:
+## Syntax
 
 ```sql
-CREATE ROUTINE LOAD [db.]job_name [ON tbl_name]
-[merge_type]
-[load_properties]
-[job_properties]
-FROM data_source [data_source_properties]
-[COMMENT "comment"]
+CREATE ROUTINE LOAD [<db.>]<job_name> [ON <tbl_name>]

Review Comment:
   ```suggestion
   CREATE ROUTINE LOAD [<db>.]<job_name> [ON <tbl_name>]
   ```



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

