
Setting Up a Hadoop + Spark Environment on Windows

Hadoop environment variable configuration:

HADOOP_HOME:D:\ProgramData\BigData\Hadoop\hadoop-2.7.4

Path:%HADOOP_HOME%\bin

Spark environment variable configuration:

SPARK_HOME:D:\ProgramData\BigData\Spark\spark-2.2.0

Path:%SPARK_HOME%\bin
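
With both sets of variables in place, a quick sanity check from a newly opened command prompt (assuming the values were added through the Windows system environment dialog) might look like this:

echo %HADOOP_HOME%
echo %SPARK_HOME%
where hadoop.cmd
where spark-shell.cmd

The echo commands should print the paths above, and where should resolve both scripts through the Path entries. (hadoop version and spark-shell --version become meaningful checks once JAVA_HOME is configured in step [5] and the Windows binaries are in place from step [6].)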

Modify the Hadoop configuration files:

[1]. Edit the core-site.xml file under D:\ProgramData\BigData\Hadoop\hadoop-2.7.4\etc\hadoop:

    <configuration>
        <property>
            <name>hadoop.tmp.dir</name>
            <value>/D:/ProgramData/BigData/Hadoop/hadoop-2.7.4/workplace/tmp</value>
        </property>
        <property>
            <name>dfs.name.dir</name>
            <value>/D:/ProgramData/BigData/Hadoop/hadoop-2.7.4/workplace/name</value>
        </property>
        <property>
            <name>fs.default.name</name>
            <value>hdfs://localhost:8084/Hadoop</value>
        </property>
    </configuration>
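
The two local paths above do not exist yet; they can be created up front (an optional sketch — the directories simply mirror the hadoop.tmp.dir and dfs.name.dir values):

mkdir D:\ProgramData\BigData\Hadoop\hadoop-2.7.4\workplace\tmp
mkdir D:\ProgramData\BigData\Hadoop\hadoop-2.7.4\workplace\name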

[2]. Edit the mapred-site.xml file in the D:\ProgramData\BigData\Hadoop\hadoop-2.7.4\etc\hadoop directory (if it does not exist, rename mapred-site.xml.template to mapred-site.xml):

    <configuration>
        <property>
            <name>mapreduce.framework.name</name>
            <value>yarn</value>
        </property>
        <property>
            <name>mapred.job.tracker</name>
            <value>hdfs://localhost:8085/Hadoop</value>
        </property>
    </configuration>

[3]. Edit the hdfs-site.xml file in the D:\ProgramData\BigData\Hadoop\hadoop-2.7.4\etc\hadoop directory:

    <configuration>
        <!-- Set replication to 1 because this is a single-node Hadoop installation -->
        <property>
            <name>dfs.replication</name>
            <value>1</value>
        </property>
        <property>
            <name>dfs.data.dir</name>
            <value>/D:/ProgramData/BigData/Hadoop/hadoop-2.7.4/workplace/data</value>
        </property>
    </configuration>
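
As with the tmp and name directories earlier, the DataNode directory can be created ahead of time (optional; the path mirrors dfs.data.dir):

mkdir D:\ProgramData\BigData\Hadoop\hadoop-2.7.4\workplace\data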

[4]. Edit the yarn-site.xml file in the D:\ProgramData\BigData\Hadoop\hadoop-2.7.4\etc\hadoop directory:

    <configuration>
        <property>
            <name>yarn.nodemanager.aux-services</name>
            <value>mapreduce_shuffle</value>
        </property>
        <property>
            <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
            <value>org.apache.hadoop.mapred.ShuffleHandler</value>
        </property>
    </configuration>
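
These two properties enable the MapReduce shuffle auxiliary service on the NodeManager. There is nothing to run at this point, but once the daemons are started in step [7], the standard YARN CLI can confirm that the NodeManager has registered:

yarn node -list

A single node in RUNNING state should be reported.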

[5]. Edit the hadoop-env.cmd file in the D:\ProgramData\BigData\Hadoop\hadoop-2.7.4\etc\hadoop directory: comment out the existing JAVA_HOME line with @rem and set JAVA_HOME to the actual JDK path:

@rem set JAVA_HOME=%JAVA_HOME%
set JAVA_HOME=D:/ProgramLanguage/Java/jdk/jdk1.8.0_144

[6]. Extract the downloaded hadooponwindows-master.zip and replace the bin directory of the original Hadoop installation with the bin directory from the archive (it contains the Windows .dll and .exe binaries that Hadoop needs):
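
One way to do the swap from the command line, assuming the archive was extracted to D:\Downloads\hadooponwindows-master (a hypothetical location; the backup step is optional but makes it easy to roll back):

move D:\ProgramData\BigData\Hadoop\hadoop-2.7.4\bin D:\ProgramData\BigData\Hadoop\hadoop-2.7.4\bin.bak
xcopy /E /I D:\Downloads\hadooponwindows-master\bin D:\ProgramData\BigData\Hadoop\hadoop-2.7.4\bin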

[7]. Test HDFS: create an input directory and upload two sample files.
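
The upload commands below need a running HDFS instance. A minimal start-up sketch using the standard Hadoop 2.x scripts (format the NameNode only once; run from an administrator command prompt):

hdfs namenode -format
%HADOOP_HOME%\sbin\start-dfs.cmd
%HADOOP_HOME%\sbin\start-yarn.cmd

The scripts typically open separate console windows for the daemons; jps (from the JDK) should afterwards list NameNode, DataNode, ResourceManager and NodeManager. The NameNode web UI is then reachable at http://localhost:50070 and the ResourceManager UI at http://localhost:8088 (the Hadoop 2.x defaults). Note that the port in the hdfs:// URIs below has to match the one configured in fs.default.name in step [1]; this post uses 8084 there and 8087 here, so adjust one of them accordingly. Once the daemons are up, the upload commands should go through: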

hadoop fs -mkdir hdfs://localhost:8087/user
hadoop fs -mkdir hdfs://localhost:8087/user/wcinput
hadoop fs -put D:\ProgramData\BigData\Hadoop\hadoop-2.7.4\workplace\data\file1.txt hdfs://localhost:8087/user/wcinput
hadoop fs -put D:\ProgramData\BigData\Hadoop\hadoop-2.7.4\workplace\data\file2.txt hdfs://localhost:8087/user/wcinput
hadoop fs -ls hdfs://localhost:8087/user/wcinput
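
To double-check that the upload worked, the file contents can be read back from HDFS (same port and path as the commands above):

hadoop fs -cat hdfs://localhost:8087/user/wcinput/file1.txt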
