
Spark Cluster Installation and Startup

The whole installation process follows this blog post:

https://blog.csdn.net/JavaMoo/article/details/77175579

For the "JAVA_HOME is not set" problem encountered along the way, see this post:

https://blog.csdn.net/u014052851/article/details/76549451
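
In practice, this error usually means Spark's startup scripts cannot locate the JDK on one of the nodes. A minimal sketch of the common fix, assuming the JDK path shown below (adjust it to your actual installation), is to set JAVA_HOME explicitly in conf/spark-env.sh on every node:

cd /opt/spark/spark-2.1.1-bin-hadoop2.7/conf
cp spark-env.sh.template spark-env.sh
# The JDK path below is an assumption; point it at your own JDK install
echo 'export JAVA_HOME=/opt/java/jdk1.8.0_121' >> spark-env.sh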

(1) Start Hadoop

Starting Hadoop: you only need to run the start script on hserver1 (the NameNode):

cd /opt/hadoop/hadoop-2.8.0/sbin

./start-all.sh

Once everything is up, the Hadoop (HDFS) web UI should be reachable at:

http://192.168.234.132:50070/
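
As a quick sanity check before moving on, a sketch of verifying the daemons with jps (the exact process list depends on your configuration):

jps
# Expected on hserver1 (roughly): NameNode, SecondaryNameNode, ResourceManager
# Expected on each slave node: DataNode, NodeManager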

(2) Start Spark

Starting Spark: with Hadoop running normally, execute the following on hserver1 (the Hadoop NameNode, which also serves as the Spark master node):

cd /opt/spark/spark-2.1.1-bin-hadoop2.7/sbin

./start-all.sh

The Spark master web UI should then be reachable at:

http://192.168.234.132:8080/
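
To confirm that the cluster actually accepts jobs, a minimal sketch (assuming the default master address spark://hserver1:7077 and the examples jar bundled with this Spark build) is to submit the SparkPi example:

cd /opt/spark/spark-2.1.1-bin-hadoop2.7
# The master URL assumes the default port 7077 on hserver1; adjust if yours differs
./bin/spark-submit --class org.apache.spark.examples.SparkPi \
  --master spark://hserver1:7077 \
  examples/jars/spark-examples_2.11-2.1.1.jar 10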
