Repository: kylin
Updated Branches:
  refs/heads/document 01a87a4de -> 07bac894f


remove SNAPSHOT from the spark cubing document


Project: http://git-wip-us.apache.org/repos/asf/kylin/repo
Commit: http://git-wip-us.apache.org/repos/asf/kylin/commit/07bac894
Tree: http://git-wip-us.apache.org/repos/asf/kylin/tree/07bac894
Diff: http://git-wip-us.apache.org/repos/asf/kylin/diff/07bac894

Branch: refs/heads/document
Commit: 07bac894f6eb21b14a7ea9245827fcbbc48d456b
Parents: 01a87a4
Author: shaofengshi <shaofeng...@apache.org>
Authored: Thu Jul 27 15:14:21 2017 +0800
Committer: shaofengshi <shaofeng...@apache.org>
Committed: Thu Jul 27 15:14:21 2017 +0800

----------------------------------------------------------------------
 website/_docs20/tutorial/cube_spark.md | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/kylin/blob/07bac894/website/_docs20/tutorial/cube_spark.md
----------------------------------------------------------------------
diff --git a/website/_docs20/tutorial/cube_spark.md 
b/website/_docs20/tutorial/cube_spark.md
index 175f87c..c34afa8 100644
--- a/website/_docs20/tutorial/cube_spark.md
+++ b/website/_docs20/tutorial/cube_spark.md
@@ -41,7 +41,7 @@ vi $KYLIN_HOME/hadoop-conf/hive-site.xml (change 
"hive.execution.engine" value f
 Now, let Kylin know about this directory via the property "kylin.env.hadoop-conf-dir" 
in kylin.properties:
 
 {% highlight Groff markup %}
-kylin.env.hadoop-conf-dir=/usr/local/apache-kylin-2.0.0-SNAPSHOT-bin/hadoop-conf
+kylin.env.hadoop-conf-dir=/usr/local/apache-kylin-2.0.0-bin/hadoop-conf
 {% endhighlight %}
 
 If this property isn't set, Kylin will use the directory where "hive-site.xml" 
is located; since that folder may have no "hbase-site.xml", Spark will get an 
HBase/ZK connection error.
@@ -140,7 +140,7 @@ After all steps be successfully executed, the Cube becomes 
"Ready" and you can q
 When you get an error, check "logs/kylin.log" first. It contains the 
full Spark command that Kylin executes, e.g.:
 
 {% highlight Groff markup %}
-2017-03-06 14:44:38,574 INFO  [Job 2d5c1178-c6f6-4b50-8937-8e5e3b39227e-306] 
spark.SparkExecutable:121 : cmd:export 
HADOOP_CONF_DIR=/usr/local/apache-kylin-2.0.0-SNAPSHOT-bin/hadoop-conf && 
/usr/local/apache-kylin-2.0.0-SNAPSHOT-bin/spark/bin/spark-submit --class 
org.apache.kylin.common.util.SparkEntry  --conf spark.executor.instances=1  
--conf 
spark.yarn.jar=hdfs://sandbox.hortonworks.com:8020/kylin/spark/spark-assembly-1.6.3-hadoop2.6.0.jar
  --conf spark.yarn.queue=default  --conf 
spark.yarn.am.extraJavaOptions=-Dhdp.version=current  --conf 
spark.history.fs.logDirectory=hdfs:///kylin/spark-history  --conf 
spark.driver.extraJavaOptions=-Dhdp.version=current  --conf spark.master=yarn  
--conf spark.executor.extraJavaOptions=-Dhdp.version=current  --conf 
spark.executor.memory=1G  --conf spark.eventLog.enabled=true  --conf 
spark.eventLog.dir=hdfs:///kylin/spark-history  --conf spark.executor.cores=2  
--conf spark.submit.deployMode=cluster --files 
/etc/hbase/2.4.0.0-169/0/hbase-site.xml
  --jars 
/usr/local/apache-kylin-2.0.0-SNAPSHOT-bin/spark/lib/spark-assembly-1.6.3-hadoop2.6.0.jar,/usr/hdp/2.4.0.0-169/hbase/lib/htrace-core-3.1.0-incubating.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-client-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-common-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-protocol-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/metrics-core-2.2.0.jar,/usr/hdp/2.4.0.0-169/hbase/lib/guava-12.0.1.jar,
 /usr/local/apache-kylin-2.0.0-SNAPSHOT-bin/lib/kylin-job-2.0.0-SNAPSHOT.jar 
-className org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable 
kylin_intermediate_kylin_sales_cube_555c4d32_40bb_457d_909a_1bb017bf2d9e 
-segmentId 555c4d32-40bb-457d-909a-1bb017bf2d9e -confPath 
/usr/local/apache-kylin-2.0.0-SNAPSHOT-bin/conf -output 
hdfs:///kylin/kylin_metadata/kylin-2d5c1178-c6f6-4b50-8937-8e5e3b39227e/kylin_sales_cube/cuboid/
 -cubename kylin_sales_cube
+2017-03-06 14:44:38,574 INFO  [Job 2d5c1178-c6f6-4b50-8937-8e5e3b39227e-306] 
spark.SparkExecutable:121 : cmd:export 
HADOOP_CONF_DIR=/usr/local/apache-kylin-2.0.0-bin/hadoop-conf && 
/usr/local/apache-kylin-2.0.0-bin/spark/bin/spark-submit --class 
org.apache.kylin.common.util.SparkEntry  --conf spark.executor.instances=1  
--conf 
spark.yarn.jar=hdfs://sandbox.hortonworks.com:8020/kylin/spark/spark-assembly-1.6.3-hadoop2.6.0.jar
  --conf spark.yarn.queue=default  --conf 
spark.yarn.am.extraJavaOptions=-Dhdp.version=current  --conf 
spark.history.fs.logDirectory=hdfs:///kylin/spark-history  --conf 
spark.driver.extraJavaOptions=-Dhdp.version=current  --conf spark.master=yarn  
--conf spark.executor.extraJavaOptions=-Dhdp.version=current  --conf 
spark.executor.memory=1G  --conf spark.eventLog.enabled=true  --conf 
spark.eventLog.dir=hdfs:///kylin/spark-history  --conf spark.executor.cores=2  
--conf spark.submit.deployMode=cluster --files 
/etc/hbase/2.4.0.0-169/0/hbase-site.xml --jars 
/usr/local/apache-kylin-2.0.0-bin/spark/lib/spark-assembly-1.6.3-hadoop2.6.0.jar,/usr/hdp/2.4.0.0-169/hbase/lib/htrace-core-3.1.0-incubating.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-client-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-common-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/hbase-protocol-1.1.2.2.4.0.0-169.jar,/usr/hdp/2.4.0.0-169/hbase/lib/metrics-core-2.2.0.jar,/usr/hdp/2.4.0.0-169/hbase/lib/guava-12.0.1.jar,
 /usr/local/apache-kylin-2.0.0-bin/lib/kylin-job-2.0.0.jar -className 
org.apache.kylin.engine.spark.SparkCubingByLayer -hiveTable 
kylin_intermediate_kylin_sales_cube_555c4d32_40bb_457d_909a_1bb017bf2d9e 
-segmentId 555c4d32-40bb-457d-909a-1bb017bf2d9e -confPath 
/usr/local/apache-kylin-2.0.0-bin/conf -output 
hdfs:///kylin/kylin_metadata/kylin-2d5c1178-c6f6-4b50-8937-8e5e3b39227e/kylin_sales_cube/cuboid/
 -cubename kylin_sales_cube
 
 {% endhighlight %}
 

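The diff above swaps the SNAPSHOT package paths for the release paths. As a quick sanity check (a hypothetical helper, not part of this commit; the sample file path is an assumption for illustration), leftover "SNAPSHOT" strings in a doc file can be counted with grep:

```shell
# Hypothetical check: write a sample line like the one this commit fixes,
# then count remaining SNAPSHOT references in the file with grep.
printf 'kylin.env.hadoop-conf-dir=/usr/local/apache-kylin-2.0.0-SNAPSHOT-bin/hadoop-conf\n' \
  > /tmp/cube_spark_sample.md
grep -c 'SNAPSHOT' /tmp/cube_spark_sample.md
```

Run against the real tree, the same pattern (e.g. `grep -rn SNAPSHOT website/_docs20/`) would list any tutorial pages still pointing at SNAPSHOT builds.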