This is an automated email from the ASF dual-hosted git repository.

liaoxin pushed a commit to branch master
in repository https://gitbox.apache.org/repos/asf/doris-website.git


The following commit(s) were added to refs/heads/master by this push:
     new 446ab03077b Add AWS MSK Routine Load docs in EN/ZH/JA (#3520)
446ab03077b is described below

commit 446ab03077bc2294b4f7217de59053e8aaf1e1df
Author: 黄瑞鑫 <[email protected]>
AuthorDate: Fri Apr 10 11:27:29 2026 +0800

    Add AWS MSK Routine Load docs in EN/ZH/JA (#3520)
---
 docs/data-operate/import/data-source/aws-msk.md    | 170 ++++++++++++++++++++
 .../data-operate/import/data-source/aws-msk.md     | 171 +++++++++++++++++++++
 .../data-operate/import/data-source/aws-msk.md     | 170 ++++++++++++++++++++
 sidebars.ts                                        |   1 +
 4 files changed, 512 insertions(+)

diff --git a/docs/data-operate/import/data-source/aws-msk.md 
b/docs/data-operate/import/data-source/aws-msk.md
new file mode 100644
index 00000000000..c6fbf7309ab
--- /dev/null
+++ b/docs/data-operate/import/data-source/aws-msk.md
@@ -0,0 +1,170 @@
+---
+{
+    "title": "AWS MSK",
+    "language": "en",
+    "description": "Doris provides Routine Load to import data from AWS MSK"
+}
+---
+
+Amazon Managed Streaming for Apache Kafka (AWS MSK) is a fully managed Apache 
Kafka service provided by AWS. Similar to consuming from Kafka directly, Doris 
supports real-time data import from AWS MSK through Routine Load with IAM-based 
authentication. CSV and JSON formats are supported, with Exactly-Once semantics 
to ensure no data loss and no duplication. For more information, see [Routine 
Load](../import-way/routine-load-manual.md).
+
+## Authentication Parameters
+
+| Parameter | Description | Example |
+| :--- | :--- | :--- |
+| aws.region | AWS Region | "us-east-1" |
+| aws.access_key | AWS Access Key ID | \ |
+| aws.secret_key | AWS Secret Access Key | \ |
+| aws.role_arn | Role used for cross-account access | 
"arn:aws:iam::123456789012:role/MyRole" |
+| aws.profile_name | AWS profile name configured in `~/.aws/credentials` | \ |
+| aws.credentials_provider | Standard AWS SDK credentials provider, supporting multiple provider types | "INSTANCE_PROFILE" |
+| aws.external_id | A "caller context identifier" for AssumeRole | \ |
+| property.security.protocol | Due to IAM authentication requirements, this 
must be `SASL_SSL` | "SASL_SSL" |
+| property.sasl.mechanism | Due to librdkafka constraints, this must be 
`OAUTHBEARER` | "OAUTHBEARER" |
+
+## Prerequisites
+
+1. An AWS MSK cluster has been created with IAM authentication enabled.
+2. Appropriate AWS IAM permissions are configured to allow access to the MSK cluster.
+3. The Doris cluster can reach the AWS MSK bootstrap servers.
+
+## Authentication Configuration
+
+Doris supports the following IAM authentication modes:
+
+### 1. Use Access Key and Secret Key (AK/SK) directly
+
+```sql
+CREATE ROUTINE LOAD IAM_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+
+    "aws.region" = "us-west-1",
+    "aws.access_key" = "<your-ak>",
+    "aws.secret_key" = "<your-sk>",
+
+    "property.kafka_default_offsets" = "OFFSET_BEGINNING",
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```
+
+### 2. IAM Role (Assume Role) mode
+
+When `aws.role_arn` is configured, `aws.credentials_provider` specifies which 
source credential provider is used for the STS AssumeRole call.
+
+**Example 1: Use EC2 Instance Profile as STS source credentials**
+
+```sql
+CREATE ROUTINE LOAD IAM_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+
+    "aws.region" = "us-west-1",
+    "aws.role_arn" = "arn:aws:iam::123456789012:role/demo-role",
+    "aws.credentials_provider" = "INSTANCE_PROFILE",
+
+    "property.kafka_default_offsets" = "OFFSET_BEGINNING",
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```
+
+**Example 2: Use AK/SK from environment variables as STS source credentials**
+
+```sql
+CREATE ROUTINE LOAD IAM_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+
+    "aws.region" = "us-west-1",
+    "aws.role_arn" = "arn:aws:iam::123456789012:role/demo-role",
+    "aws.credentials_provider" = "ENV",
+
+    "property.kafka_default_offsets" = "OFFSET_BEGINNING",
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```
+
+**Example 3: Use the default provider chain as STS source credentials**
+
+```sql
+CREATE ROUTINE LOAD IAM_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+
+    "aws.region" = "us-west-1",
+    "aws.role_arn" = "arn:aws:iam::123456789012:role/demo-role",
+    "aws.credentials_provider" = "DEFAULT",
+
+    "property.kafka_default_offsets" = "OFFSET_BEGINNING",
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```
+
+### 3. Specify credential source through aws.credentials_provider
+
+This mode applies when AK/SK is not explicitly provided, such as using EC2 
Instance Profile.
+
+```sql
+CREATE ROUTINE LOAD IAM_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+
+    "aws.region" = "us-west-1",
+    "aws.credentials_provider" = "INSTANCE_PROFILE",
+
+    "property.kafka_default_offsets" = "OFFSET_BEGINNING",
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```
+
+Available `aws.credentials_provider` values:
+
+| Parameter | Description |
+| :--- | :--- |
+| DEFAULT | Use the default provider chain |
+| ENV | Read credentials from environment variables |
+| INSTANCE_PROFILE | Use EC2 Instance Profile credentials |
+
+### Precedence Rules When Configuring Multiple Options
+
+1. If both `aws.access_key` and `aws.secret_key` are configured, AK/SK takes precedence.
+2. If AK/SK is not configured but `aws.role_arn` is configured, IAM Role is 
used. In this case, `aws.credentials_provider` is used to choose the STS source 
credentials.
+3. If neither AK/SK nor `aws.role_arn` is configured, 
`aws.credentials_provider` directly determines which provider the AWS client 
uses.
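+
+For instance, rule 1 means that a job which (perhaps inadvertently) sets all three kinds of credential options will authenticate with the AK/SK and ignore `aws.role_arn`. A minimal sketch (the job name and broker/topic values are placeholders):
+
+```sql
+CREATE ROUTINE LOAD Precedence_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+
+    "aws.region" = "us-west-1",
+    -- AK/SK is configured, so it takes precedence (rule 1);
+    -- aws.role_arn and aws.credentials_provider are ignored here.
+    "aws.access_key" = "<your-ak>",
+    "aws.secret_key" = "<your-sk>",
+    "aws.role_arn" = "arn:aws:iam::123456789012:role/demo-role",
+    "aws.credentials_provider" = "DEFAULT",
+
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```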
+
+## Public Internet Access
+
+For users who need to access AWS MSK over the public internet, if AWS 
authentication issues occur during data import, troubleshoot step by step using 
the guidance below.
+
+1. Ensure public access is enabled for the MSK cluster.
+In the AWS MSK console, select your cluster and check **Properties** > 
**Networking settings** > **Edit public access settings**. Ensure public access 
is enabled.
+2. Ensure the subnet is public.
+The subnet associated with the cluster must be public. In the AWS VPC console, 
make sure the subnet route table includes a `0.0.0.0/0 -> igw-xxxx` entry.
+3. Use the correct public bootstrap endpoints.
+In the AWS MSK console, select the cluster and click **View client 
information**. Ensure `kafka_broker_list` in your Routine Load job uses 
**public endpoints** instead of private endpoints.
+4. Ensure security group inbound/outbound rules are correct.
+Check the MSK security group inbound rules and verify that **port 9198** is 
properly opened to the required source IP ranges (when communicating with 
brokers through IAM access control, port 9198 must be publicly reachable).
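+
+If authentication still fails after these checks, the Routine Load job typically transitions to the PAUSED state. You can inspect the failure reason and, after fixing the configuration, resume the job (reusing the job name from the examples above):
+
+```sql
+-- Check job state; the ReasonOfStateChanged and ErrorLogUrls
+-- columns usually point at the authentication or network error.
+SHOW ROUTINE LOAD FOR IAM_Test;
+
+-- After correcting the credentials or network settings, resume the job.
+RESUME ROUTINE LOAD FOR IAM_Test;
+```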
+
+For more details, see AWS documentation:
+- [How to safely access an Amazon Managed Streaming for Apache Kafka (Amazon 
MSK) cluster over the 
internet](https://aws.amazon.com/cn/blogs/china/how-to-safely-access-amazon-managed-streaming-for-apache-kafka-amazon-msk-cluster-through-the-internet-i/)
+- [Access from within AWS but outside cluster's 
VPC](https://docs.aws.amazon.com/msk/latest/developerguide/aws-access.html)
+- [Enable internet access for your VPC using an internet 
gateway](https://docs.aws.amazon.com/vpc/latest/userguide/VPC_Internet_Gateway.html)
diff --git 
a/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/data-source/aws-msk.md
 
b/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/data-source/aws-msk.md
new file mode 100644
index 00000000000..977994b26bf
--- /dev/null
+++ 
b/i18n/zh-CN/docusaurus-plugin-content-docs/current/data-operate/import/data-source/aws-msk.md
@@ -0,0 +1,171 @@
+---
+{
+    "title": "AWS MSK",
+    "language": "zh-CN",
+    "description": "Doris 提供 Routine Load 方式从 AWS MSK 导入数据"
+}
+---
+
+Amazon Managed Streaming for Apache Kafka (AWS MSK) 是 AWS 提供的完全托管的 Apache Kafka 服务。与直接消费 Kafka 类似,Doris 支持通过 Routine Load 从 AWS MSK 实时导入数据,并支持基于 IAM 的身份验证。支持 CSV 和 JSON 格式,具备 Exactly-Once 语义,确保数据不丢失且不重复。更多信息请参考 [Routine Load](../import-way/routine-load-manual.md)。
+
+## 认证参数
+
+| 参数名 | 说明 | 示例 |
+| :--- | :--- | :--- |
+| aws.region | AWS Region | "us-east-1" |
+| aws.access_key | AWS Access Key ID | \ |
+| aws.secret_key | AWS Secret Access Key | \ |
+| aws.role_arn | 用于跨账号访问的 Role | "arn:aws:iam::123456789012:role/MyRole" |
+| aws.profile_name | 在 `~/.aws/credentials` 中配置的 AWS profile 名称 | \ |
+| aws.credentials_provider | AWS SDK 的标准凭证提供者,支持多种提供者类型 | "INSTANCE_PROFILE" |
+| aws.external_id | 作为 AssumeRole 的一个“调用上下文标识” | \ |
+| property.security.protocol | 由于 IAM 认证限制,固定填写 `SASL_SSL` | "SASL_SSL" |
+| property.sasl.mechanism | 由于 librdkafka 库限制,固定填写 `OAUTHBEARER` | "OAUTHBEARER" |
+
+
+## 前提条件
+
+1. AWS MSK 集群已创建并启用 IAM 身份验证
+2. 已配置适当的 AWS IAM 权限,允许访问 MSK 集群
+3. Doris 集群能够访问 AWS MSK 的 Bootstrap Servers
+
+## 认证配置
+
+Doris 支持以下几种方式进行IAM认证:
+
+### 1. 直接使用 Access Key 和 Secret Key(AK/SK)
+
+```SQL
+CREATE ROUTINE LOAD IAM_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+    
+    "aws.region" = "us-west-1",
+    "aws.access_key" = "<your-ak>",
+    "aws.secret_key" = "<your-sk>",
+    
+    "property.kafka_default_offsets" = "OFFSET_BEGINNING",
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```
+
+### 2. IAM Role(Assume Role)模式
+
+当配置了 aws.role_arn 时,aws.credentials_provider 用于指定 STS AssumeRole 调用所使用的源凭证 
provider:
+
+**示例 1:EC2 Instance Profile 作为 STS 源凭证**
+
+```SQL
+CREATE ROUTINE LOAD IAM_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+    
+    "aws.region" = "us-west-1",
+    "aws.role_arn" = "arn:aws:iam::123456789012:role/demo-role",
+    "aws.credentials_provider" = "INSTANCE_PROFILE",
+    
+    "property.kafka_default_offsets" = "OFFSET_BEGINNING",
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```
+
+**示例 2:从环境变量读取 AK/SK 作为 STS 源凭证**
+
+```SQL
+CREATE ROUTINE LOAD IAM_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+    
+    "aws.region" = "us-west-1",
+    "aws.role_arn" = "arn:aws:iam::123456789012:role/demo-role",
+    "aws.credentials_provider" = "ENV",
+    
+    "property.kafka_default_offsets" = "OFFSET_BEGINNING",
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```
+
+**示例 3:使用默认 provider chain 作为 STS 源凭证**
+
+```SQL
+CREATE ROUTINE LOAD IAM_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+    
+    "aws.region" = "us-west-1",
+    "aws.role_arn" = "arn:aws:iam::123456789012:role/demo-role",
+    "aws.credentials_provider" = "DEFAULT",
+    
+    "property.kafka_default_offsets" = "OFFSET_BEGINNING",
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```
+
+### 3. 通过 aws.credentials_provider 指定凭证来源
+
+适用于不显式填写 AK/SK 的场景,例如 EC2 Instance Profile。
+
+```SQL
+CREATE ROUTINE LOAD IAM_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+    
+    "aws.region" = "us-west-1",
+    "aws.credentials_provider" = "INSTANCE_PROFILE",
+    
+    "property.kafka_default_offsets" = "OFFSET_BEGINNING",
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```
+
+`aws.credentials_provider` 可选值:
+
+| 参数名 | 说明 |
+| :--- | :--- |
+| DEFAULT | 使用默认 provider chain |
+| ENV | 从环境变量读取凭证 |
+| INSTANCE_PROFILE | 使用 EC2 Instance Profile 凭证 |
+
+
+### 同时配置时的生效规则
+
+1. 同时配置 aws.access_key 和 aws.secret_key 时,优先使用 AK/SK。
+2. 未配置 AK/SK 且配置了 aws.role_arn 时,使用 IAM Role;此时aws.credentials_provider 用于 STS 
源凭证选择。
+3. 未配置 AK/SK 且未配置 aws.role_arn 时,aws.credentials_provider 直接决定 AWS 客户端使用的 
provider。
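+
+例如,规则 1 意味着同时设置三类凭证选项的作业将使用 AK/SK 认证,忽略 `aws.role_arn`。以下为示意(作业名与 broker/topic 为占位值):
+
+```sql
+CREATE ROUTINE LOAD Precedence_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+
+    "aws.region" = "us-west-1",
+    -- 已配置 AK/SK,按规则 1 优先生效;
+    -- aws.role_arn 和 aws.credentials_provider 在此被忽略。
+    "aws.access_key" = "<your-ak>",
+    "aws.secret_key" = "<your-sk>",
+    "aws.role_arn" = "arn:aws:iam::123456789012:role/demo-role",
+    "aws.credentials_provider" = "DEFAULT",
+
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```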
+
+## 公网访问
+
+对于需要从公网访问 AWS MSK 的用户,如果在数据导入过程中出现 AWS 认证问题,可按以下步骤逐一排查。
+
+1. 确保MSK集群启用了公共访问权限
+在AWS MSK控制台中,选择访问的集群,查看**属性**中的**联网设置**:**编辑公共访问权限**,确保公共访问权限一栏是打开的。
+2. 确保子网公开
+与集群关联的子网必须是公开的。在 AWS VPC 控制台中,确保子网的路由表包含 `0.0.0.0/0 -> igw-xxxx` 条目。
+3. 使用正确的Bootstrap公共端点
+在 AWS MSK 控制台中,选择所访问的集群,点击**查看客户端信息**,确保创建 Routine Load 时 `kafka_broker_list` 属性填写的是**公共端点**而不是**私有端点**。
+4. 确保安全组配置正确的出入站规则
+查看 MSK 安全组的**入站规则**,确认**端口 9198**(通过 IAM 访问控制与 Broker 通信时,9198 端口需可公开访问)已对所需的源 IP 范围开放。
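+
+若排查后仍出现认证失败,Routine Load 作业通常会进入 PAUSED 状态。可通过以下语句查看失败原因,修复配置后恢复作业(沿用上文示例中的作业名):
+
+```sql
+-- 查看作业状态;ReasonOfStateChanged 和 ErrorLogUrls
+-- 列通常会指出认证或网络相关的错误。
+SHOW ROUTINE LOAD FOR IAM_Test;
+
+-- 修正凭证或网络配置后,恢复作业。
+RESUME ROUTINE LOAD FOR IAM_Test;
+```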
+
+更详细的信息可以参考AWS相关文档:
+- [如何通过互联网安全地访问Amazon Managed Streaming for Apache Kafka (Amazon MSK) 
集群](https://aws.amazon.com/cn/blogs/china/how-to-safely-access-amazon-managed-streaming-for-apache-kafka-amazon-msk-cluster-through-the-internet-i/)
+- [Access from within AWS but outside cluster's 
VPC](https://docs.aws.amazon.com/msk/latest/developerguide/aws-access.html)
+- [使用互联网网关为 VPC 
启用互联网访问](https://docs.aws.amazon.com/zh_cn/vpc/latest/userguide/VPC_Internet_Gateway.html)
diff --git 
a/ja-source/docusaurus-plugin-content-docs/current/data-operate/import/data-source/aws-msk.md
 
b/ja-source/docusaurus-plugin-content-docs/current/data-operate/import/data-source/aws-msk.md
new file mode 100644
index 00000000000..9eb16f0fa40
--- /dev/null
+++ 
b/ja-source/docusaurus-plugin-content-docs/current/data-operate/import/data-source/aws-msk.md
@@ -0,0 +1,170 @@
+---
+{
+  "title": "AWS MSK",
+  "language": "ja",
+  "description": "DorisはRoutine Loadを使用してAWS MSKからデータをインポートできます"
+}
+---
+
+Amazon Managed Streaming for Apache Kafka (AWS MSK) は、AWS が提供するフルマネージドの Apache 
Kafka サービスです。Kafka を直接消費する場合と同様に、Doris は Routine Load を使用して AWS MSK 
からリアルタイムにデータをインポートでき、IAM ベースの認証に対応しています。CSV および JSON 形式をサポートし、Exactly-Once 
セマンティクスによりデータの欠損や重複を防ぎます。詳細は [Routine 
Load](../import-way/routine-load-manual.md) を参照してください。
+
+## 認証パラメータ
+
+| パラメータ名 | 説明 | 例 |
+| :--- | :--- | :--- |
+| aws.region | AWS Region | "us-east-1" |
+| aws.access_key | AWS Access Key ID | \ |
+| aws.secret_key | AWS Secret Access Key | \ |
+| aws.role_arn | クロスアカウントアクセス用のロール | "arn:aws:iam::123456789012:role/MyRole" |
+| aws.profile_name | `~/.aws/credentials` に設定した AWS プロファイル名 | \ |
+| aws.credentials_provider | AWS SDK の標準クレデンシャルプロバイダー(複数タイプをサポート) | "INSTANCE_PROFILE" |
+| aws.external_id | AssumeRole の「呼び出しコンテキスト識別子」 | \ |
+| property.security.protocol | IAM 認証の制約により `SASL_SSL` 固定 | "SASL_SSL" |
+| property.sasl.mechanism | librdkafka の制約により `OAUTHBEARER` 固定 | "OAUTHBEARER" 
|
+
+## 前提条件
+
+1. AWS MSK クラスターが作成済みで、IAM 認証が有効化されていること。
+2. MSK クラスターへアクセスできる適切な AWS IAM 権限が設定されていること。
+3. Doris クラスターから AWS MSK の Bootstrap Servers に接続できること。
+
+## 認証設定
+
+Doris は以下の IAM 認証方式をサポートします。
+
+### 1. Access Key と Secret Key(AK/SK)を直接使用
+
+```sql
+CREATE ROUTINE LOAD IAM_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+
+    "aws.region" = "us-west-1",
+    "aws.access_key" = "<your-ak>",
+    "aws.secret_key" = "<your-sk>",
+
+    "property.kafka_default_offsets" = "OFFSET_BEGINNING",
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```
+
+### 2. IAM Role(Assume Role)モード
+
+`aws.role_arn` を設定した場合、`aws.credentials_provider` は STS AssumeRole 
呼び出し時に使用するソースクレデンシャルプロバイダーを指定します。
+
+**例1: STS のソースクレデンシャルに EC2 Instance Profile を使用**
+
+```sql
+CREATE ROUTINE LOAD IAM_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+
+    "aws.region" = "us-west-1",
+    "aws.role_arn" = "arn:aws:iam::123456789012:role/demo-role",
+    "aws.credentials_provider" = "INSTANCE_PROFILE",
+
+    "property.kafka_default_offsets" = "OFFSET_BEGINNING",
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```
+
+**例2: STS のソースクレデンシャルに環境変数の AK/SK を使用**
+
+```sql
+CREATE ROUTINE LOAD IAM_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+
+    "aws.region" = "us-west-1",
+    "aws.role_arn" = "arn:aws:iam::123456789012:role/demo-role",
+    "aws.credentials_provider" = "ENV",
+
+    "property.kafka_default_offsets" = "OFFSET_BEGINNING",
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```
+
+**例3: STS のソースクレデンシャルにデフォルトプロバイダーチェーンを使用**
+
+```sql
+CREATE ROUTINE LOAD IAM_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+
+    "aws.region" = "us-west-1",
+    "aws.role_arn" = "arn:aws:iam::123456789012:role/demo-role",
+    "aws.credentials_provider" = "DEFAULT",
+
+    "property.kafka_default_offsets" = "OFFSET_BEGINNING",
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```
+
+### 3. aws.credentials_provider でクレデンシャルソースを指定
+
+AK/SK を明示的に設定しないケース(例: EC2 Instance Profile)で利用します。
+
+```sql
+CREATE ROUTINE LOAD IAM_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+
+    "aws.region" = "us-west-1",
+    "aws.credentials_provider" = "INSTANCE_PROFILE",
+
+    "property.kafka_default_offsets" = "OFFSET_BEGINNING",
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```
+
+`aws.credentials_provider` の指定可能値:
+
+| パラメータ名 | 説明 |
+| :--- | :--- |
+| DEFAULT | デフォルトのプロバイダーチェーンを使用 |
+| ENV | 環境変数からクレデンシャルを取得 |
+| INSTANCE_PROFILE | EC2 Instance Profile のクレデンシャルを使用 |
+
+### 複数設定時の優先ルール
+
+1. `aws.access_key` と `aws.secret_key` を同時に設定した場合は AK/SK が優先されます。
+2. AK/SK が未設定で `aws.role_arn` が設定されている場合は IAM Role を使用します。このとき 
`aws.credentials_provider` は STS のソースクレデンシャル選択に使用されます。
+3. AK/SK と `aws.role_arn` のどちらも未設定の場合、`aws.credentials_provider` が AWS 
クライアントのプロバイダーを直接決定します。
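+
+例えば、ルール 1 により、3 種類のクレデンシャル設定をすべて指定したジョブは AK/SK で認証され、`aws.role_arn` は無視されます。以下はその例です(ジョブ名や broker/topic はプレースホルダーです):
+
+```sql
+CREATE ROUTINE LOAD Precedence_Test ON t
+COLUMNS TERMINATED BY ",",
+COLUMNS(a,b)
+FROM KAFKA(
+    "kafka_broker_list" = "your_msk_broker_list",
+    "kafka_topic" = "your_kafka_topic",
+
+    "aws.region" = "us-west-1",
+    -- AK/SK が設定されているため、ルール 1 により優先されます。
+    -- aws.role_arn と aws.credentials_provider はここでは無視されます。
+    "aws.access_key" = "<your-ak>",
+    "aws.secret_key" = "<your-sk>",
+    "aws.role_arn" = "arn:aws:iam::123456789012:role/demo-role",
+    "aws.credentials_provider" = "DEFAULT",
+
+    "property.security.protocol" = "SASL_SSL",
+    "property.sasl.mechanism" = "OAUTHBEARER"
+);
+```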
+
+## パブリックインターネット経由のアクセス
+
+パブリックインターネット経由で AWS MSK にアクセスする必要がある場合、データインポート中に AWS 
認証エラーが発生したら、以下の手順で確認してください。
+
+1. MSK クラスターでパブリックアクセスが有効であることを確認する。  
+AWS MSK コンソールで対象クラスターを選択し、**Properties** > **Networking settings** > **Edit 
public access settings** を確認して、パブリックアクセスが有効になっていることを確認します。
+2. サブネットがパブリックであることを確認する。  
+クラスターに関連付けられたサブネットはパブリックである必要があります。AWS VPC コンソールで、サブネットのルートテーブルに `0.0.0.0/0 -> 
igw-xxxx` のエントリがあることを確認します。
+3. 正しい Bootstrap のパブリックエンドポイントを使用する。  
+AWS MSK コンソールで対象クラスターの **View client information** を開き、Routine Load 作成時の 
`kafka_broker_list` に **private endpoints** ではなく **public endpoints** 
を指定していることを確認します。
+4. セキュリティグループのインバウンド/アウトバウンドルールが正しいことを確認する。  
+MSK のセキュリティグループのインバウンドルールで、**ポート 9198**(IAM アクセス制御で Broker と通信する場合に必要)が適切な送信元 
IP 範囲に対して開放されていることを確認します。
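+
+上記を確認しても認証に失敗する場合、Routine Load ジョブは通常 PAUSED 状態になります。以下のステートメントで失敗理由を確認し、設定修正後にジョブを再開できます(ジョブ名は上記の例を流用しています):
+
+```sql
+-- ジョブ状態を確認します。ReasonOfStateChanged と ErrorLogUrls
+-- 列に認証やネットワーク関連のエラーが表示されます。
+SHOW ROUTINE LOAD FOR IAM_Test;
+
+-- クレデンシャルまたはネットワーク設定を修正した後、ジョブを再開します。
+RESUME ROUTINE LOAD FOR IAM_Test;
+```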
+
+詳細は AWS ドキュメントを参照してください:
+- [How to safely access an Amazon Managed Streaming for Apache Kafka (Amazon 
MSK) cluster over the 
internet](https://aws.amazon.com/cn/blogs/china/how-to-safely-access-amazon-managed-streaming-for-apache-kafka-amazon-msk-cluster-through-the-internet-i/)
+- [Access from within AWS but outside cluster's 
VPC](https://docs.aws.amazon.com/msk/latest/developerguide/aws-access.html)
+- [Enable internet access for your VPC using an internet 
gateway](https://docs.aws.amazon.com/vpc/latest/userguide/VPC_Internet_Gateway.html)
diff --git a/sidebars.ts b/sidebars.ts
index b06df910879..5e00ea1dd89 100644
--- a/sidebars.ts
+++ b/sidebars.ts
@@ -162,6 +162,7 @@ const sidebars: SidebarsConfig = {
                             items: [
                                 'data-operate/import/data-source/local-file',
                                 'data-operate/import/data-source/kafka',
+                                'data-operate/import/data-source/aws-msk',
                                 'data-operate/import/data-source/flink',
                                 'data-operate/import/data-source/hdfs',
                                 'data-operate/import/data-source/amazon-s3',


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
