This is an automated email from the ASF dual-hosted git repository.

morningman pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris-website.git
The following commit(s) were added to refs/heads/master by this push:
     new 5f37e3140a [fix](load) fix broker load parallel instance docs (#1298)
5f37e3140a is described below

commit 5f37e3140a594f0a8192918070766904f69fdc83
Author: Kaijie Chen <c...@apache.org>
AuthorDate: Thu Nov 7 22:09:46 2024 +0800

    [fix](load) fix broker load parallel instance docs (#1298)

    # Versions

    - [x] dev
    - [x] 3.0
    - [x] 2.1
    - [x] 2.0

    # Languages

    - [x] Chinese
    - [x] English
---
 docs/data-operate/import/import-way/broker-load-manual.md            | 2 +-
 .../current/data-operate/import/import-way/broker-load-manual.md     | 2 +-
 .../version-1.2/data-operate/import/import-way/broker-load-manual.md | 2 +-
 .../version-2.0/data-operate/import/broker-load-manual.md            | 2 +-
 .../version-2.1/data-operate/import/import-way/broker-load-manual.md | 2 +-
 .../version-3.0/data-operate/import/import-way/broker-load-manual.md | 2 +-
 versioned_docs/version-2.0/data-operate/import/broker-load-manual.md | 2 +-
 .../version-2.1/data-operate/import/import-way/broker-load-manual.md | 2 +-
 .../version-3.0/data-operate/import/import-way/broker-load-manual.md | 2 +-
 9 files changed, 9 insertions(+), 9 deletions(-)

diff --git a/docs/data-operate/import/import-way/broker-load-manual.md b/docs/data-operate/import/import-way/broker-load-manual.md
index 702357fc0b..1261d617e3 100644
--- a/docs/data-operate/import/import-way/broker-load-manual.md
+++ b/docs/data-operate/import/import-way/broker-load-manual.md
@@ -665,7 +665,7 @@ Typically, the maximum supported data volume for an import job is `max_bytes_per
 - The minimum processed data volume, maximum concurrency, size of the source file, and the current number of BE nodes jointly determine the concurrency of this import.

 ```Plain
-Import Concurrency = Math.min(Source File Size / Minimum Processing Amount, Maximum Concurrency, Current Number of BE Nodes)
+Import Concurrency = Math.min(Source File Size / min_bytes_per_broker_scanner, max_broker_concurrency, Current Number of BE Nodes * load_parallelism)
 Processing Volume per BE for this Import = Source File Size / Import Concurrency
 ```

diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/import-way/broker-load-manual.md b/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/import-way/broker-load-manual.md
index 276360fcf1..22cd4957be 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/import-way/broker-load-manual.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/import-way/broker-load-manual.md
@@ -689,7 +689,7 @@ Broker Name is just a user-defined name and does not indicate the type of Broker.
 - The minimum amount of data to process, the maximum concurrency, the size of the source file, and the number of BE nodes in the current cluster jointly determine the concurrency of this import.

 ```Plain
-Import concurrency for this job = Math.min(Source file size / Minimum amount to process, Maximum concurrency, Number of BE nodes)
+Import concurrency for this job = Math.min(Source file size / min_bytes_per_broker_scanner, max_broker_concurrency, Number of BE nodes * load_parallelism)
 Processing volume per BE for this import = Source file size / Import concurrency for this job
 ```

diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2/data-operate/import/import-way/broker-load-manual.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2/data-operate/import/import-way/broker-load-manual.md
index 73b1c297c8..b0ed164642 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2/data-operate/import/import-way/broker-load-manual.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-1.2/data-operate/import/import-way/broker-load-manual.md
@@ -309,7 +309,7 @@ Broker Load requires a Broker process to access remote storage; different Brokers need
 The first two configurations limit the minimum and maximum amount of data processed by a single BE. The third configuration limits the maximum import concurrency of a single job. The minimum amount of data to process, the maximum concurrency, the size of the source file, and the number of BE nodes in the current cluster **jointly determine the concurrency of this import**.

 ```text
-Import concurrency for this job = Math.min(Source file size / Minimum amount to process, Maximum concurrency, Number of BE nodes)
+Import concurrency for this job = Math.min(Source file size / min_bytes_per_broker_scanner, max_broker_concurrency, Number of BE nodes * load_parallelism)
 Processing volume per BE for this import = Source file size / Import concurrency for this job
 ```

diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.0/data-operate/import/broker-load-manual.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.0/data-operate/import/broker-load-manual.md
index 95037dfb71..39cd63163c 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.0/data-operate/import/broker-load-manual.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.0/data-operate/import/broker-load-manual.md
@@ -689,7 +689,7 @@ Broker Name is just a user-defined name and does not indicate the type of Broker.
 - The minimum amount of data to process, the maximum concurrency, the size of the source file, and the number of BE nodes in the current cluster jointly determine the concurrency of this import.

 ```Plain
-Import concurrency for this job = Math.min(Source file size / Minimum amount to process, Maximum concurrency, Number of BE nodes)
+Import concurrency for this job = Math.min(Source file size / min_bytes_per_broker_scanner, max_broker_concurrency, Number of BE nodes * load_parallelism)
 Processing volume per BE for this import = Source file size / Import concurrency for this job
 ```

diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/data-operate/import/import-way/broker-load-manual.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/data-operate/import/import-way/broker-load-manual.md
index 276360fcf1..22cd4957be 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/data-operate/import/import-way/broker-load-manual.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-2.1/data-operate/import/import-way/broker-load-manual.md
@@ -689,7 +689,7 @@ Broker Name is just a user-defined name and does not indicate the type of Broker.
 - The minimum amount of data to process, the maximum concurrency, the size of the source file, and the number of BE nodes in the current cluster jointly determine the concurrency of this import.

 ```Plain
-Import concurrency for this job = Math.min(Source file size / Minimum amount to process, Maximum concurrency, Number of BE nodes)
+Import concurrency for this job = Math.min(Source file size / min_bytes_per_broker_scanner, max_broker_concurrency, Number of BE nodes * load_parallelism)
 Processing volume per BE for this import = Source file size / Import concurrency for this job
 ```

diff --git a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/data-operate/import/import-way/broker-load-manual.md b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/data-operate/import/import-way/broker-load-manual.md
index 276360fcf1..22cd4957be 100644
--- a/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/data-operate/import/import-way/broker-load-manual.md
+++ b/i18n/zh-CN/docusaurus-plugin-content-docs/version-3.0/data-operate/import/import-way/broker-load-manual.md
@@ -689,7 +689,7 @@ Broker Name is just a user-defined name and does not indicate the type of Broker.
 - The minimum amount of data to process, the maximum concurrency, the size of the source file, and the number of BE nodes in the current cluster jointly determine the concurrency of this import.

 ```Plain
-Import concurrency for this job = Math.min(Source file size / Minimum amount to process, Maximum concurrency, Number of BE nodes)
+Import concurrency for this job = Math.min(Source file size / min_bytes_per_broker_scanner, max_broker_concurrency, Number of BE nodes * load_parallelism)
 Processing volume per BE for this import = Source file size / Import concurrency for this job
 ```

diff --git a/versioned_docs/version-2.0/data-operate/import/broker-load-manual.md b/versioned_docs/version-2.0/data-operate/import/broker-load-manual.md
index b54df0a4b8..8a55167790 100644
--- a/versioned_docs/version-2.0/data-operate/import/broker-load-manual.md
+++ b/versioned_docs/version-2.0/data-operate/import/broker-load-manual.md
@@ -665,7 +665,7 @@ Typically, the maximum supported data volume for an import job is `max_bytes_per
 - The minimum processed data volume, maximum concurrency, size of the source file, and the current number of BE nodes jointly determine the concurrency of this import.
 ```Plain
-Import Concurrency = Math.min(Source File Size / Minimum Processing Amount, Maximum Concurrency, Current Number of BE Nodes)
+Import Concurrency = Math.min(Source File Size / min_bytes_per_broker_scanner, max_broker_concurrency, Current Number of BE Nodes * load_parallelism)
 Processing Volume per BE for this Import = Source File Size / Import Concurrency
 ```

diff --git a/versioned_docs/version-2.1/data-operate/import/import-way/broker-load-manual.md b/versioned_docs/version-2.1/data-operate/import/import-way/broker-load-manual.md
index 702357fc0b..1261d617e3 100644
--- a/versioned_docs/version-2.1/data-operate/import/import-way/broker-load-manual.md
+++ b/versioned_docs/version-2.1/data-operate/import/import-way/broker-load-manual.md
@@ -665,7 +665,7 @@ Typically, the maximum supported data volume for an import job is `max_bytes_per
 - The minimum processed data volume, maximum concurrency, size of the source file, and the current number of BE nodes jointly determine the concurrency of this import.

 ```Plain
-Import Concurrency = Math.min(Source File Size / Minimum Processing Amount, Maximum Concurrency, Current Number of BE Nodes)
+Import Concurrency = Math.min(Source File Size / min_bytes_per_broker_scanner, max_broker_concurrency, Current Number of BE Nodes * load_parallelism)
 Processing Volume per BE for this Import = Source File Size / Import Concurrency
 ```

diff --git a/versioned_docs/version-3.0/data-operate/import/import-way/broker-load-manual.md b/versioned_docs/version-3.0/data-operate/import/import-way/broker-load-manual.md
index 702357fc0b..1261d617e3 100644
--- a/versioned_docs/version-3.0/data-operate/import/import-way/broker-load-manual.md
+++ b/versioned_docs/version-3.0/data-operate/import/import-way/broker-load-manual.md
@@ -665,7 +665,7 @@ Typically, the maximum supported data volume for an import job is `max_bytes_per
 - The minimum processed data volume, maximum concurrency, size of the source file, and the current number of BE nodes jointly determine the concurrency of this import.

 ```Plain
-Import Concurrency = Math.min(Source File Size / Minimum Processing Amount, Maximum Concurrency, Current Number of BE Nodes)
+Import Concurrency = Math.min(Source File Size / min_bytes_per_broker_scanner, max_broker_concurrency, Current Number of BE Nodes * load_parallelism)
 Processing Volume per BE for this Import = Source File Size / Import Concurrency
 ```

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@doris.apache.org
For additional commands, e-mail: commits-h...@doris.apache.org
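For reference, a minimal Java sketch of how the corrected formula combines its three terms. The class name and every numeric value below are hypothetical illustrations, not Doris defaults; the real limits come from the FE configuration items min_bytes_per_broker_scanner, max_broker_concurrency, and load_parallelism.

```java
public class BrokerLoadConcurrencyExample {
    public static void main(String[] args) {
        // All values are assumptions for illustration only.
        long sourceFileSize = 100L * 1024 * 1024 * 1024;   // assumed 100 GB source file
        long minBytesPerBrokerScanner = 64L * 1024 * 1024; // min_bytes_per_broker_scanner (assumed 64 MB)
        long maxBrokerConcurrency = 10;                    // max_broker_concurrency (assumed)
        long beNodes = 3;                                  // current number of BE nodes (assumed)
        long loadParallelism = 1;                          // load_parallelism (assumed)

        // Import Concurrency = min(file size / min_bytes_per_broker_scanner,
        //                          max_broker_concurrency,
        //                          BE nodes * load_parallelism)
        long concurrency = Math.min(
                Math.min(sourceFileSize / minBytesPerBrokerScanner, maxBrokerConcurrency),
                beNodes * loadParallelism);
        long bytesPerInstance = sourceFileSize / concurrency;

        // With the assumed numbers: min(1600, 10, 3 * 1) = 3 instances,
        // each handling roughly one third of the source file.
        System.out.println("Import concurrency = " + concurrency);
        System.out.println("Bytes per instance = " + bytesPerInstance);
    }
}
```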