This is an automated email from the ASF dual-hosted git repository.

shaofengshi pushed a commit to branch document
in repository https://gitbox.apache.org/repos/asf/kylin.git
commit bacbc584f114b33efba0c5a44c4144a10235bad7
Author: GinaZhai <[email protected]>
AuthorDate: Thu Aug 9 23:19:39 2018 +0800

    KYLIN-3486 cube_streaming page command exception

    Signed-off-by: shaofengshi <[email protected]>
---
 website/_docs/tutorial/cube_spark.cn.md       |  2 +-
 website/_docs/tutorial/cube_streaming.cn.md   | 14 +++++++-------
 website/_docs/tutorial/cube_streaming.md      |  6 +++---
 website/_docs16/tutorial/cube_streaming.md    |  6 +++---
 website/_docs20/tutorial/cube_streaming.md    |  6 +++---
 website/_docs21/tutorial/cube_streaming.md    |  6 +++---
 website/_docs23/tutorial/cube_spark.cn.md     |  2 +-
 website/_docs23/tutorial/cube_streaming.cn.md | 14 +++++++-------
 website/_docs23/tutorial/cube_streaming.md    |  6 +++---
 9 files changed, 31 insertions(+), 31 deletions(-)

diff --git a/website/_docs/tutorial/cube_spark.cn.md b/website/_docs/tutorial/cube_spark.cn.md
index b1909cc..913be9f 100644
--- a/website/_docs/tutorial/cube_spark.cn.md
+++ b/website/_docs/tutorial/cube_spark.cn.md
@@ -138,7 +138,7 @@ Kylin 启动后,访问 Kylin 网站,在 "Advanced Setting" 页,编辑名
 当出现 error,您可以首先查看 "logs/kylin.log". 其中包含 Kylin 执行的所有 Spark 命令,例如:
 
 {% highlight Groff markup %}
-2017-03-06 14:44:38,574 INFO [Job 2d5c1178-c6f6-4b50-8937-8e5e3b39227e-306] spark.SparkExecutable:121 : cmd:export HADOOP_CONF_DIR=/usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf && /usr/local/apache-kylin-2.1.0-bin-hbase1x/spark/bin/spark-submit --class org.apache.kylin.common.util.SparkEntry --conf spark.executor.instances=1 --conf spark.yarn.queue=default --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=current --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-his [...]
+2017-03-06 14:44:38,574 INFO [Job 2d5c1178-c6f6-4b50-8937-8e5e3b39227e-306] spark.SparkExecutable:121 : cmd:export HADOOP_CONF_DIR=/usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf && /usr/local/apache-kylin-2.1.0-bin-hbase1x/spark/bin/spark-submit --class org.apache.kylin.common.util.SparkEntry --conf spark.executor.instances=1 --conf spark.yarn.queue=default --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=current --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-his [...]
 {% endhighlight %}

diff --git a/website/_docs/tutorial/cube_streaming.cn.md b/website/_docs/tutorial/cube_streaming.cn.md
index 00d5429..2433554 100644
--- a/website/_docs/tutorial/cube_streaming.cn.md
+++ b/website/_docs/tutorial/cube_streaming.cn.md
@@ -14,7 +14,7 @@ Kylin v1.6 发布了可扩展的 streaming cubing 功能,它利用 Hadoop 消
 ## 安装 Kafka 0.10.0.0 和 Kylin
 不要使用 HDP 2.2.4 自带的 Kafka,因为它太旧了,如果其运行着请先停掉。
 {% highlight Groff markup %}
-curl -s http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
+curl -s https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
@@ -127,7 +127,7 @@ Streaming Cube 和普通的 cube 大致上一样.
 有以下几点需要您注意
 您可以在 web GUI 触发 build,通过点击 "Actions" -> "Build",或用 'curl' 命令发送一个请求到 Kylin RESTful API:
 {% highlight Groff markup %}
-curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, "sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2
 {% endhighlight %}
 
 请注意 API 终端和普通 cube 不一样 (这个 URL 以 "build2" 结尾)。
@@ -139,7 +139,7 @@ curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8"
 ## 点击 "Insight" 标签,编写 SQL 运行,例如:
 {% highlight Groff markup %}
-select minute_start,count(*),sum(amount),sum(qty) from streaming_sales_table group by minute_start order by minute_start
+select minute_start, count(*), sum(amount), sum(qty) from streaming_sales_table group by minute_start order by minute_start
 {% endhighlight %}
 
 结果如下。
@@ -152,7 +152,7 @@ select minute_start,count(*),sum(amount),sum(qty) from streaming_sales_tab
 {% highlight Groff markup %}
 crontab -e
-*/5 * * * * curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2
+*/5 * * * * curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, "sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2
 {% endhighlight %}
 
 现在您可以观看 cube 从 streaming 中自动 built。当 cube segments 累积到更大的时间范围,Kylin 将会自动的将其合并到一个更大的 segment 中。
@@ -202,18 +202,18 @@ Caused by: java.lang.ClassNotFoundException: org.apache.kafka.clients.producer.P
 * 如果 Kafka 里已经有一组历史 message 且您不想从最开始 build,您可以触发一个调用来将当前的结束位置设为 cube 的开始:
 {% highlight Groff markup %}
-curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/init_start_offsets
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, "sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/init_start_offsets
 {% endhighlight %}
 
 * 如果一些 build job 出错了并且您将其 discard,Cube 中就会留有一个洞(或称为空隙)。每一次 Kylin 都会从最后的位置 build,您不可期望通过正常的 builds 将洞填补。Kylin 提供了 API 检查和填补洞
 检查洞:
 {% highlight Groff markup %}
-curl -X GET --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X GET --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
 如果查询结果是一个空的数组,意味着没有洞;否则,触发 Kylin 填补他们:
 {% highlight Groff markup %}
-curl -X PUT --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}

diff --git a/website/_docs/tutorial/cube_streaming.md b/website/_docs/tutorial/cube_streaming.md
index 65d99d8..0989b04 100644
--- a/website/_docs/tutorial/cube_streaming.md
+++ b/website/_docs/tutorial/cube_streaming.md
@@ -14,7 +14,7 @@ In this tutorial, we will use Hortonworks HDP 2.2.4 Sandbox VM + Kafka v0.10.0(S
 ## Install Kafka 0.10.0.0 and Kylin
 Don't use HDP 2.2.4's build-in Kafka as it is too old, stop it first if it is running.
 {% highlight Groff markup %}
-curl -s http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
+curl -s https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
@@ -209,11 +209,11 @@ curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8"
 Check holes:
 {% highlight Groff markup %}
-curl -X GET --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X GET --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
 If the result is an empty arrary, means there is no hole; Otherwise, trigger Kylin to fill them:
 {% highlight Groff markup %}
-curl -X PUT --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}

diff --git a/website/_docs16/tutorial/cube_streaming.md b/website/_docs16/tutorial/cube_streaming.md
index 6909545..63d81eb 100644
--- a/website/_docs16/tutorial/cube_streaming.md
+++ b/website/_docs16/tutorial/cube_streaming.md
@@ -14,7 +14,7 @@ In this tutorial, we will use Hortonworks HDP 2.2.4 Sandbox VM + Kafka v0.10.0(S
 ## Install Kafka 0.10.0.0 and Kylin
 Don't use HDP 2.2.4's build-in Kafka as it is too old, stop it first if it is running.
 {% highlight Groff markup %}
-curl -s http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
+curl -s https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
@@ -209,11 +209,11 @@ curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8"
 Check holes:
 {% highlight Groff markup %}
-curl -X GET --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X GET --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
 If the result is an empty arrary, means there is no hole; Otherwise, trigger Kylin to fill them:
 {% highlight Groff markup %}
-curl -X PUT --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}

diff --git a/website/_docs20/tutorial/cube_streaming.md b/website/_docs20/tutorial/cube_streaming.md
index 08e5bf9..115d91a 100644
--- a/website/_docs20/tutorial/cube_streaming.md
+++ b/website/_docs20/tutorial/cube_streaming.md
@@ -14,7 +14,7 @@ In this tutorial, we will use Hortonworks HDP 2.2.4 Sandbox VM + Kafka v0.10.0(S
 ## Install Kafka 0.10.0.0 and Kylin
 Don't use HDP 2.2.4's build-in Kafka as it is too old, stop it first if it is running.
 {% highlight Groff markup %}
-curl -s http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
+curl -s https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
@@ -209,11 +209,11 @@ curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8"
 Check holes:
 {% highlight Groff markup %}
-curl -X GET --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X GET --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
 If the result is an empty arrary, means there is no hole; Otherwise, trigger Kylin to fill them:
 {% highlight Groff markup %}
-curl -X PUT --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}

diff --git a/website/_docs21/tutorial/cube_streaming.md b/website/_docs21/tutorial/cube_streaming.md
index fa96db5..dd5eba2 100644
--- a/website/_docs21/tutorial/cube_streaming.md
+++ b/website/_docs21/tutorial/cube_streaming.md
@@ -14,7 +14,7 @@ In this tutorial, we will use Hortonworks HDP 2.2.4 Sandbox VM + Kafka v0.10.0(S
 ## Install Kafka 0.10.0.0 and Kylin
 Don't use HDP 2.2.4's build-in Kafka as it is too old, stop it first if it is running.
 {% highlight Groff markup %}
-curl -s http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
+curl -s https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
@@ -209,11 +209,11 @@ curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8"
 Check holes:
 {% highlight Groff markup %}
-curl -X GET --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X GET --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
 If the result is an empty arrary, means there is no hole; Otherwise, trigger Kylin to fill them:
 {% highlight Groff markup %}
-curl -X PUT --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}

diff --git a/website/_docs23/tutorial/cube_spark.cn.md b/website/_docs23/tutorial/cube_spark.cn.md
index ca0dd99..f456a89 100644
--- a/website/_docs23/tutorial/cube_spark.cn.md
+++ b/website/_docs23/tutorial/cube_spark.cn.md
@@ -142,7 +142,7 @@ Kylin 启动后,访问 Kylin 网站,在 "Advanced Setting" 页,编辑名
 当出现 error,您可以首先查看 "logs/kylin.log". 其中包含 Kylin 执行的所有 Spark 命令,例如:
 {% highlight Groff markup %}
-2017-03-06 14:44:38,574 INFO [Job 2d5c1178-c6f6-4b50-8937-8e5e3b39227e-306] spark.SparkExecutable:121 : cmd:export HADOOP_CONF_DIR=/usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf && /usr/local/apache-kylin-2.1.0-bin-hbase1x/spark/bin/spark-submit --class org.apache.kylin.common.util.SparkEntry --conf spark.executor.instances=1 --conf spark.yarn.queue=default --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=current --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-his [...]
+2017-03-06 14:44:38,574 INFO [Job 2d5c1178-c6f6-4b50-8937-8e5e3b39227e-306] spark.SparkExecutable:121 : cmd:export HADOOP_CONF_DIR=/usr/local/apache-kylin-2.1.0-bin-hbase1x/hadoop-conf && /usr/local/apache-kylin-2.1.0-bin-hbase1x/spark/bin/spark-submit --class org.apache.kylin.common.util.SparkEntry --conf spark.executor.instances=1 --conf spark.yarn.queue=default --conf spark.yarn.am.extraJavaOptions=-Dhdp.version=current --conf spark.history.fs.logDirectory=hdfs:///kylin/spark-his [...]
 {% endhighlight %}

diff --git a/website/_docs23/tutorial/cube_streaming.cn.md b/website/_docs23/tutorial/cube_streaming.cn.md
index cf094e9..99bfd96 100644
--- a/website/_docs23/tutorial/cube_streaming.cn.md
+++ b/website/_docs23/tutorial/cube_streaming.cn.md
@@ -14,7 +14,7 @@ Kylin v1.6 发布了可扩展的 streaming cubing 功能,它利用 Hadoop 消
 ## 安装 Kafka 0.10.0.0 和 Kylin
 不要使用 HDP 2.2.4 自带的 Kafka,因为它太旧了,如果其运行着请先停掉。
 {% highlight Groff markup %}
-curl -s http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
+curl -s https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
@@ -127,7 +127,7 @@ Streaming Cube 和普通的 cube 大致上一样.
 有以下几点需要您注意
 您可以在 web GUI 触发 build,通过点击 "Actions" -> "Build",或用 'curl' 命令发送一个请求到 Kylin RESTful API:
 {% highlight Groff markup %}
-curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, "sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2
 {% endhighlight %}
 
 请注意 API 终端和普通 cube 不一样 (这个 URL 以 "build2" 结尾)。
@@ -139,7 +139,7 @@ curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8"
 ## 点击 "Insight" 标签,编写 SQL 运行,例如:
 {% highlight Groff markup %}
-select minute_start,count(*),sum(amount),sum(qty) from streaming_sales_table group by minute_start order by minute_start
+select minute_start, count(*), sum(amount), sum(qty) from streaming_sales_table group by minute_start order by minute_start
 {% endhighlight %}
 
 结果如下。
@@ -152,7 +152,7 @@ select minute_start,count(*),sum(amount),sum(qty) from streaming_sales_tab
 {% highlight Groff markup %}
 crontab -e
-*/5 * * * * curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2
+*/5 * * * * curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, "sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/build2
 {% endhighlight %}
 
 现在您可以观看 cube 从 streaming 中自动 built。当 cube segments 累积到更大的时间范围,Kylin 将会自动的将其合并到一个更大的 segment 中。
@@ -202,18 +202,18 @@ Caused by: java.lang.ClassNotFoundException: org.apache.kafka.clients.producer.P
 * 如果 Kafka 里已经有一组历史 message 且您不想从最开始 build,您可以触发一个调用来将当前的结束位置设为 cube 的开始:
 {% highlight Groff markup %}
-curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0,"sourceOffsetEnd": 9223372036854775807,"buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/init_start_offsets
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" -d '{ "sourceOffsetStart": 0, "sourceOffsetEnd": 9223372036854775807, "buildType": "BUILD"}' http://localhost:7070/kylin/api/cubes/{your_cube_name}/init_start_offsets
 {% endhighlight %}
 
 * 如果一些 build job 出错了并且您将其 discard,Cube 中就会留有一个洞(或称为空隙)。每一次 Kylin 都会从最后的位置 build,您不可期望通过正常的 builds 将洞填补。Kylin 提供了 API 检查和填补洞
 检查洞:
 {% highlight Groff markup %}
-curl -X GET --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X GET --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
 如果查询结果是一个空的数组,意味着没有洞;否则,触发 Kylin 填补他们:
 {% highlight Groff markup %}
-curl -X PUT --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}

diff --git a/website/_docs23/tutorial/cube_streaming.md b/website/_docs23/tutorial/cube_streaming.md
index 6b42d0f..4661511 100644
--- a/website/_docs23/tutorial/cube_streaming.md
+++ b/website/_docs23/tutorial/cube_streaming.md
@@ -14,7 +14,7 @@ In this tutorial, we will use Hortonworks HDP 2.2.4 Sandbox VM + Kafka v0.10.0(S
 ## Install Kafka 0.10.0.0 and Kylin
 Don't use HDP 2.2.4's build-in Kafka as it is too old, stop it first if it is running.
 {% highlight Groff markup %}
-curl -s http://mirrors.tuna.tsinghua.edu.cn/apache/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
+curl -s https://archive.apache.org/dist/kafka/0.10.0.0/kafka_2.10-0.10.0.0.tgz | tar -xz -C /usr/local/
 
 cd /usr/local/kafka_2.10-0.10.0.0/
@@ -209,11 +209,11 @@ curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8"
 Check holes:
 {% highlight Groff markup %}
-curl -X GET --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X GET --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
 
 If the result is an empty arrary, means there is no hole; Otherwise, trigger Kylin to fill them:
 {% highlight Groff markup %}
-curl -X PUT --user ADMINN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
+curl -X PUT --user ADMIN:KYLIN -H "Content-Type: application/json;charset=utf-8" http://localhost:7070/kylin/api/cubes/{your_cube_name}/holes
 {% endhighlight %}
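The curl commands this patch corrects all hit the same small set of Kylin REST endpoints: `/build2` to trigger a streaming build, `/init_start_offsets` to start from the latest Kafka offsets, and `/holes` (GET to check, PUT to fill). As a rough sketch of how those calls fit together, the helpers below assemble the method, URL, and JSON body for each one. This is an illustration, not an official client: the `localhost:7070` base URL, the `ADMIN:KYLIN` Basic-auth credentials, and the cube name all come from the tutorial text and would need adjusting for a real deployment.

```python
import json

# Assumption (from the tutorial): Kylin listens on localhost:7070 and the
# endpoints accept HTTP Basic auth as ADMIN:KYLIN.
KYLIN_BASE = "http://localhost:7070/kylin/api"

# Long.MAX_VALUE -- the tutorial uses it to mean "up to the latest Kafka offset".
MAX_OFFSET = 9223372036854775807


def build_request(cube_name):
    """(method, url, body) for triggering a streaming build via /build2."""
    payload = {"sourceOffsetStart": 0,
               "sourceOffsetEnd": MAX_OFFSET,
               "buildType": "BUILD"}
    return ("PUT", "%s/cubes/%s/build2" % (KYLIN_BASE, cube_name),
            json.dumps(payload))


def check_holes_request(cube_name):
    """(method, url, body) for listing segment holes; an empty JSON array
    in the response means the cube has no holes."""
    return ("GET", "%s/cubes/%s/holes" % (KYLIN_BASE, cube_name), None)


def fill_holes_request(cube_name):
    """(method, url, body) for asking Kylin to fill any detected holes."""
    return ("PUT", "%s/cubes/%s/holes" % (KYLIN_BASE, cube_name), None)
```

Each tuple maps one-to-one onto a corrected curl command in the diff above; for example, `build_request("your_cube")` corresponds to the `curl -X PUT --user ADMIN:KYLIN ... /build2` call, and an actual client would send it with any HTTP library that supports Basic auth.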
