The whole installation process followed this post:
https://blog.csdn.net/JavaMoo/article/details/77175579
For the "JAVA_HOME is not set" error I ran into, see this post:
https://blog.csdn.net/u014052851/article/details/76549451
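The usual fix for this error is to hardcode JAVA_HOME in Hadoop's env script instead of relying on the login shell's environment, since the start scripts launch daemons over ssh where the variable may not be inherited. A minimal sketch of the change (the JDK path below is an assumption; substitute the actual path from `which java` / your install):

```shell
# In /opt/hadoop/hadoop-2.8.0/etc/hadoop/hadoop-env.sh, replace the line
#   export JAVA_HOME=${JAVA_HOME}
# with an absolute path to the JDK (hypothetical path shown):
export JAVA_HOME=/usr/lib/jvm/java-8-openjdk
```

This must be done on every node in the cluster, not just the namenode.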
(1) Starting Hadoop
To start Hadoop, you only need to run the start script on hserver1 (the namenode):
cd /opt/hadoop/hadoop-2.8.0/sbin
./start-all.sh
Hadoop NameNode web UI: http://192.168.234.132:50070/
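Besides opening the web UI, a quick way to confirm the daemons actually came up (my own check, not from the referenced post) is `jps` on the namenode host. A minimal sketch that scans jps-style output for the daemons `start-all.sh` should have launched there:

```shell
# Daemons expected on the namenode host after start-all.sh (Hadoop 2.x):
expected="NameNode SecondaryNameNode ResourceManager"

# check_daemons takes jps-style output and reports any missing daemon.
check_daemons() {
  missing=""
  for d in $expected; do
    echo "$1" | grep -q "$d" || missing="$missing $d"
  done
  echo "missing:$missing"
}

# Demo against a sample listing; on the real host run: check_daemons "$(jps)"
sample="1234 NameNode
2345 SecondaryNameNode
3456 ResourceManager
4567 Jps"
check_daemons "$sample"   # prints "missing:" when all daemons are present
```

On the datanode hosts you would look for DataNode and NodeManager instead.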
(2) Starting Spark
With Hadoop running normally, start Spark on hserver1 (the Hadoop namenode, which is also the Spark master node):
cd /opt/spark/spark-2.1.1-bin-hadoop2.7/sbin
./start-all.sh
Spark master web UI: http://192.168.234.132:8080/
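Both web UIs (NameNode on 50070, Spark master on 8080) can also be probed from the command line, which is handy over ssh without a browser. A small helper sketch, assuming `curl` is available (the master IP is taken from the notes above):

```shell
# Probe a daemon web UI and print the HTTP status code.
# curl's %{http_code} prints "000" when the endpoint is unreachable.
check_ui() {
  curl -s -o /dev/null -w "%{http_code}" --connect-timeout 5 "$1"
}

# Usage on this cluster:
#   check_ui http://192.168.234.132:50070/   # Hadoop NameNode UI, expect 200
#   check_ui http://192.168.234.132:8080/    # Spark master UI, expect 200
```

A "200" means the daemon is serving its UI; "000" means the port is not reachable, which usually points at the daemon not running or a firewall rule.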