morrySnow commented on code in PR #1998:
URL: https://github.com/apache/doris-website/pull/1998#discussion_r1947458065


##########
docs/sql-manual/sql-statements/cluster-management/storage-management/ALTER-STORAGE-POLICY.md:
##########
@@ -26,29 +26,42 @@ under the License.
 
 ## Description
 
-This statement is used to modify an existing cold and hot separation migration strategy. Only root or admin users can modify resources.
+This statement is used to modify an existing hot-cold tiered migration policy. Only root or admin users can modify resources.
 
+## Syntax
 ```sql
-ALTER STORAGE POLICY  'policy_name'
-PROPERTIES ("key"="value", ...);
+ALTER STORAGE POLICY '<policy_name>' PROPERTIES ("<key>"="<value>"[, ... ]);
 ```
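
As a sketch of how the syntax above is used — the policy name and datetime value here are placeholders (the property name follows the `cooldown_datetime` example mentioned later on this page):

```sql
-- Illustrative only: change the cold-data migration time point of an existing policy.
ALTER STORAGE POLICY 'example_policy'
PROPERTIES ("cooldown_datetime" = "2025-06-08 00:00:00");
```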
 
-## Example
+## Required Parameters
+| Parameter Name    | Description                                                  |
+|-------------------|--------------------------------------------------------------|
+| `<policy_name>`   | The name of the storage policy. This is the unique identifier of the storage policy you want to modify; an existing policy name must be specified. |
 
-1. Modify the name to coolown_datetime Cold and hot separation data migration time point:
+## Optional Parameters
+
+| 参数名称          | 描述                                                         |

Review Comment:
   This needs to be in English.



##########
docs/sql-manual/sql-statements/cluster-management/storage-management/CREATE-STORAGE-VAULT.md:
##########
@@ -26,65 +26,71 @@ specific language governing permissions and limitations
 under the License.
 -->
 
-## CREATE-STORAGE-VAULT
+## Description
 
-### Description
+This command is used to create a storage vault. This document describes the syntax for creating a self-managed storage vault in Doris.
 
-This command is used to create a storage vault. The subject of this document describes the syntax for creating Doris self-maintained storage vault.
+
+## Syntax
 
 ```sql
-CREATE STORAGE VAULT [IF NOT EXISTS] vault
-[properties]
+CREATE STORAGE VAULT [IF NOT EXISTS] <`vault_name`> [ <`properties`> ]
 ```
 
-
-#### properties
-
-| param  | is required | desc                                                   |
-|:-------|:------------|:-------------------------------------------------------|
-| `type` | required    | Only two types of vaults are allowed: `S3` and `HDFS`. |
-
-##### S3 Vault
-
-| param            | is required | desc |
-|:-----------------|:------------|:-----|
-| `s3.endpoint`    | required    | The endpoint used for object storage. <br/>**Notice**, please don't provide the endpoint with any `http://` or `https://`. And for Azure Blob Storage, the endpoint should be `blob.core.windows.net`. |
-| `s3.region`      | required    | The region of your bucket. (Not required when you're using GCP or AZURE.) |
-| `s3.root.path`   | required    | The path where the data would be stored. |
-| `s3.bucket`      | required    | The bucket of your object storage account. (StorageAccount if you're using Azure.) |
-| `s3.access_key`  | required    | The access key of your object storage account. (AccountName if you're using Azure.) |
-| `s3.secret_key`  | required    | The secret key of your object storage account. (AccountKey if you're using Azure.) |
-| `provider`       | required    | The cloud vendor which provides the object storage service. The supported values include `COS`, `OSS`, `S3`, `OBS`, `BOS`, `AZURE`, `GCP`. |
-| `use_path_style` | optional    | Indicate using `path-style URL` (private environment recommended) or `virtual-hosted-style URL` (public cloud recommended), default `true` (`path-style`). |
-
-##### HDFS Vault
-
-| param                            | is required | desc |
-|:---------------------------------|:------------|:-----|
-| `fs.defaultFS`                   | required    | Hadoop configuration property that specifies the default file system to use. |
-| `path_prefix`                    | optional    | The path prefix to where the data would be stored. It would be the root_path of your Hadoop user if you don't provide any prefix. |
-| `hadoop.username`                | optional    | Hadoop configuration property that specifies the user accessing the file system. It would be the user starting the Hadoop process if you don't provide any user. |
-| `hadoop.security.authentication` | optional    | The authentication way used for Hadoop. If you'd like to use Kerberos you can provide `kerberos`. |
-| `hadoop.kerberos.principal`      | optional    | The path to your Kerberos principal. |
-| `hadoop.kerberos.keytab`         | optional    | The path to your Kerberos keytab. |
-
-### Example
-
-1. create a HDFS storage vault.
+## Required Parameters
+
+| Parameter    | Description |
+|--------------|-------------|
+| `vault_name` | The name of the storage vault. This is the unique identifier for the new storage vault you are creating. |
+
+## Optional Parameters
+| Parameter         | Description |
+|-------------------|-------------|
+| `[IF NOT EXISTS]` | If the specified storage vault already exists, the creation operation is not executed and no error is thrown. This prevents duplicate creation of the same storage vault. |
+| `PROPERTIES`      | A set of key-value pairs used to set or update specific properties of the storage vault. Each property consists of a key (`<key>`) and a value (`<value>`), separated by an equals sign (`=`). Multiple key-value pairs are separated by commas (`,`). |
+
+### S3 Vault
+
+| Parameter        | Required | Description |
+|:-----------------|:---------|:------------|
+| `s3.endpoint`    | Required | The endpoint for object storage. Note: Do not provide a link starting with `http://` or `https://`. For Azure Blob Storage, the endpoint is fixed as `blob.core.windows.net`. |
+| `s3.region`      | Required | The region of your storage bucket. (Not required if using GCP or AZURE.) |
+| `s3.root.path`   | Required | The path to store data. |
+| `s3.bucket`      | Required | The bucket of your object storage account. (For Azure, this is the StorageAccount.) |
+| `s3.access_key`  | Required | The access key for your object storage account. (For Azure, this is the AccountName.) |
+| `s3.secret_key`  | Required | The secret key for your object storage account. (For Azure, this is the AccountKey.) |
+| `provider`       | Required | The cloud provider offering the object storage service. Supported values are `COS`, `OSS`, `S3`, `OBS`, `BOS`, `AZURE`, `GCP`. |
+| `use_path_style` | Optional | Use `path-style URL` (for private deployment environments) or `virtual-hosted-style URL` (recommended for public cloud environments). Default value is `true` (path-style). |
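
Following the S3 parameter table above, a minimal creation statement might look like this sketch. The endpoint, region, bucket, and key values are placeholders, and the lowercase `"type" = "s3"` mirrors the `"type" = "hdfs"` convention used in the existing HDFS example:

```sql
-- Illustrative only: create an S3-backed storage vault with placeholder credentials.
CREATE STORAGE VAULT IF NOT EXISTS s3_vault_demo
PROPERTIES (
    "type" = "s3",                                  -- required
    "s3.endpoint" = "s3.us-east-1.amazonaws.com",   -- required; no http:// or https:// prefix
    "s3.region" = "us-east-1",                      -- required
    "s3.root.path" = "s3_vault_demo",               -- required
    "s3.bucket" = "demo-bucket",                    -- required
    "s3.access_key" = "<your_access_key>",          -- required
    "s3.secret_key" = "<your_secret_key>",          -- required
    "provider" = "S3"                               -- required
);
```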
+
+### HDFS Vault
+
+| Parameter                        | Required | Description |
+|:---------------------------------|:---------|:------------|
+| `fs.defaultFS`                   | Required | Hadoop configuration property specifying the default file system to use. |
+| `path_prefix`                    | Optional | The prefix path for storing data. If not specified, the default path under the user account will be used. |
+| `hadoop.username`                | Optional | Hadoop configuration property specifying the user to access the file system. If not specified, the user who started the Hadoop process will be used. |
+| `hadoop.security.authentication` | Optional | The authentication method for Hadoop. To use Kerberos, specify `kerberos`. |
+| `hadoop.kerberos.principal`      | Optional | The path to your Kerberos principal. |
+| `hadoop.kerberos.keytab`         | Optional | The path to your Kerberos keytab. |
+
+## Examples
+
+1. Create an HDFS storage vault.
     ```sql
     CREATE STORAGE VAULT IF NOT EXISTS hdfs_vault_demo
     PROPERTIES (
         "type" = "hdfs",                                     -- required
         "fs.defaultFS" = "hdfs://127.0.0.1:8020",            -- required
-        "path_prefix" = "big/data",                          -- optional
+        "path_prefix" = "big/data",                          -- optional, 一般按照业务名称填写

Review Comment:
   There is Chinese here. Please check all the documents; the English docs must not contain any Chinese.



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

