
Spark Cluster Installation and Startup

The full installation process follows this blog post:

https://blog.csdn.net/JavaMoo/article/details/77175579

For the "JAVA_HOME is not set" error encountered along the way, see this blog post:

https://blog.csdn.net/u014052851/article/details/76549451
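
The usual cause of that error: Spark's start-all.sh launches each worker over SSH in a non-interactive shell, so a JAVA_HOME exported only in an interactive profile is not picked up. A minimal sketch of the fix, assuming a hypothetical JDK location of /opt/java/jdk1.8.0_121 (substitute your actual path), is to set JAVA_HOME explicitly in conf/spark-env.sh on every node:

cd /opt/spark/spark-2.1.1-bin-hadoop2.7/conf
cp spark-env.sh.template spark-env.sh
# Hypothetical JDK path - point this at your real installation
echo 'export JAVA_HOME=/opt/java/jdk1.8.0_121' >> spark-env.sh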

(1) Start Hadoop

Starting Hadoop: you only need to run the start script on hserver1 (the NameNode); it brings up the daemons on the other nodes as well.

cd /opt/hadoop/hadoop-2.8.0/sbin

./start-all.sh

Once started, the HDFS NameNode web UI should be reachable at:

http://192.168.234.132:50070/
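
Beyond the web UI, a quick sanity check is jps (bundled with the JDK), which lists running Java processes. Depending on your configuration, you would expect something like the following; process IDs will differ:

jps
# Typical output on hserver1 (master): NameNode, SecondaryNameNode, ResourceManager
# Typical output on each slave node: DataNode, NodeManager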

(2) Start Spark

Starting Spark: with Hadoop running normally, execute the following commands on hserver1 (that is, the Hadoop NameNode, which is also the Spark master node):

cd /opt/spark/spark-2.1.1-bin-hadoop2.7/sbin

./start-all.sh

Once started, the Spark master web UI should be reachable at:

http://192.168.234.132:8080/
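
To confirm the cluster actually accepts work, you can submit the SparkPi example bundled with the distribution. A minimal sketch, assuming the default standalone master port 7077 and that the hostname hserver1 resolves from the submitting machine:

cd /opt/spark/spark-2.1.1-bin-hadoop2.7
# Run the bundled SparkPi example on the standalone cluster;
# the trailing 10 is the number of tasks/partitions to use
./bin/spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://hserver1:7077 \
  examples/jars/spark-examples_2.11-2.1.1.jar 10
# On success, the driver output includes a line like "Pi is roughly 3.14..."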

