Commit 8c62cf1

Author: astroshim
Parent: 633c930
Commit message: "fixed felixcheung pointed out."

6 files changed: +11 -14 lines changed

docs/_includes/themes/zeppelin/_navigation.html
Lines changed: 1 addition & 1 deletion

@@ -105,7 +105,7 @@
 <li class="title"><span><b>Advanced</b><span></li>
 <li><a href="{{BASE_PATH}}/install/virtual_machine.html">Zeppelin on Vagrant VM</a></li>
 <li><a href="{{BASE_PATH}}/install/spark_cluster_mode.html#spark-standalone-mode">Zeppelin on Spark Cluster Mode (Standalone)</a></li>
-<li><a href="{{BASE_PATH}}/install/spark_cluster_mode.html#spark-standalone-mode">Zeppelin on Spark Cluster Mode (Yarn)</a></li>
+<li><a href="{{BASE_PATH}}/install/spark_cluster_mode.html#spark-standalone-mode">Zeppelin on Spark Cluster Mode (YARN)</a></li>
 <li role="separator" class="divider"></li>
 <li class="title"><span><b>Contibute</b><span></li>
 <li><a href="{{BASE_PATH}}/development/writingzeppelininterpreter.html">Writing Zeppelin Interpreter</a></li>

docs/index.md
Lines changed: 1 addition & 1 deletion

@@ -170,7 +170,7 @@ Join to our [Mailing list](https://zeppelin.apache.org/community.html) and repor
 * Advanced
   * [Apache Zeppelin on Vagrant VM](./install/virtual_machine.html)
   * [Zeppelin on Spark Cluster Mode (Standalone via Docker)](./install/spark_cluster_mode.html#spark-standalone-mode)
-  * [Zeppelin on Spark Cluster Mode (Yarn via Docker)](./install/spark_cluster_mode.html#spark-yarn-mode)
+  * [Zeppelin on Spark Cluster Mode (YARN via Docker)](./install/spark_cluster_mode.html#spark-yarn-mode)
 * Contribute
   * [Writing Zeppelin Interpreter](./development/writingzeppelininterpreter.html)
   * [Writing Zeppelin Application (Experimental)](./development/writingzeppelinapplication.html)

docs/install/spark_cluster_mode.md
Lines changed: 4 additions & 4 deletions

@@ -72,8 +72,8 @@ ps -ef | grep spark
 ```


-## Spark on Yarn mode
-You can simply set up [Spark on Yarn](http://spark.apache.org/docs/latest/running-on-yarn.html) docker environment with below steps.
+## Spark on YARN mode
+You can simply set up [Spark on YARN](http://spark.apache.org/docs/latest/running-on-yarn.html) docker environment with below steps.

 > **Note :** Since Apache Zeppelin and Spark use same `8080` port for their web UI, you might need to change `zeppelin.server.port` in `conf/zeppelin-site.xml`.

@@ -111,9 +111,9 @@ docker run -it \
 spark_yarn bash;
 ```

-### 3. Verify running Spark on Yarn.
+### 3. Verify running Spark on YARN.

-You can simply verify the processes of Spark and Yarn is running well in Docker with below command.
+You can simply verify the processes of Spark and YARN is running well in Docker with below command.


 ```

scripts/docker/spark-cluster-managers/spark_standalone/Dockerfile
Lines changed: 0 additions & 1 deletion

@@ -13,7 +13,6 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 FROM centos:centos6
-

 ENV SPARK_PROFILE 1.6
 ENV SPARK_VERSION 1.6.2

scripts/docker/spark-cluster-managers/spark_yarn_cluster/Dockerfile
Lines changed: 3 additions & 5 deletions

@@ -13,10 +13,9 @@
 # See the License for the specific language governing permissions and
 # limitations under the License.
 FROM centos:centos6
-

-ENV SPARK_PROFILE 1.6
-ENV SPARK_VERSION 1.6.1
+ENV SPARK_PROFILE 2.0
+ENV SPARK_VERSION 2.0.0
 ENV HADOOP_PROFILE 2.3
 ENV HADOOP_VERSION 2.3.0

@@ -73,7 +72,6 @@ RUN rm /usr/local/hadoop/lib/native/*
 RUN curl -Ls http://dl.bintray.com/sequenceiq/sequenceiq-bin/hadoop-native-64.tar|tar -x -C /usr/local/hadoop/lib/native/

 # install spark
-#RUN curl -s https://www.apache.org/dist/spark/spark-1.6.1/spark-1.6.1-bin-hadoop2.3.tgz | tar -xz -C /usr/local/
 RUN curl -s http://archive.apache.org/dist/spark/spark-$SPARK_VERSION/spark-$SPARK_VERSION-bin-hadoop$HADOOP_PROFILE.tgz | tar -xz -C /usr/local/
 RUN cd /usr/local && ln -s spark-$SPARK_VERSION-bin-hadoop$HADOOP_PROFILE spark
 ENV SPARK_HOME /usr/local/spark

@@ -106,4 +104,4 @@ EXPOSE 8030 8031 8032 8033 8040 8042 8088
 #spark
 EXPOSE 8080 7077 8888 8081

-ENTRYPOINT ["/etc/entrypoint.sh"]
\ No newline at end of file
+ENTRYPOINT ["/etc/entrypoint.sh"]
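A note on why bumping only `SPARK_PROFILE` and `SPARK_VERSION` is sufficient here: the Dockerfile's download `RUN curl` line builds the Spark archive URL entirely from those ENV values, so no other line needs to change. A minimal sketch of that interpolation, using the values from the diff (the `echo` stands in for the `curl` download and is the only thing added here):

```shell
# Values set by ENV in the Dockerfile above (taken from the diff)
SPARK_VERSION=2.0.0
HADOOP_PROFILE=2.3

# The same interpolation the RUN curl line performs:
echo "http://archive.apache.org/dist/spark/spark-$SPARK_VERSION/spark-$SPARK_VERSION-bin-hadoop$HADOOP_PROFILE.tgz"
# → http://archive.apache.org/dist/spark/spark-2.0.0/spark-2.0.0-bin-hadoop2.3.tgz
```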

scripts/docker/spark-cluster-managers/spark_yarn_cluster/entrypoint.sh
Lines changed: 2 additions & 2 deletions

@@ -32,7 +32,7 @@ service sshd start
 $HADOOP_PREFIX/sbin/start-dfs.sh
 $HADOOP_PREFIX/sbin/start-yarn.sh

-$HADOOP_PREFIX/bin/hdfs dfsadmin -safemode leave && $HADOOP_PREFIX/bin/hdfs dfs -put $SPARK_HOME-1.6.1-bin-hadoop2.3/lib /spark
+$HADOOP_PREFIX/bin/hdfs dfsadmin -safemode leave && $HADOOP_PREFIX/bin/hdfs dfs -put $SPARK_HOME-$SPARK_VERSION-bin-hadoop$HADOOP_PROFILE/lib /spark

 # start spark
 export SPARK_MASTER_OPTS="-Dspark.driver.port=7001 -Dspark.fileserver.port=7002

@@ -57,4 +57,4 @@ then
 /usr/sbin/sshd -D -d
 else
 /bin/bash -c "$*"
-fi
\ No newline at end of file
+fi
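The entrypoint.sh fix replaces a path hardcoded to Spark 1.6.1 with one derived from the same ENV values the Dockerfile sets, so future version bumps cannot silently break the `hdfs dfs -put` step. A sketch of how the fixed path expands, assuming the values from the spark_yarn_cluster Dockerfile (`SPARK_HOME=/usr/local/spark` is a symlink created next to the versioned install directory, which is why appending the version suffix to it yields a real path):

```shell
# Assumed values, as set by the Dockerfile above
SPARK_HOME=/usr/local/spark
SPARK_VERSION=2.0.0
HADOOP_PROFILE=2.3

# Expansion of the fixed hdfs -put source path:
echo "$SPARK_HOME-$SPARK_VERSION-bin-hadoop$HADOOP_PROFILE/lib"
# → /usr/local/spark-2.0.0-bin-hadoop2.3/lib
```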
