
Installing and starting Hadoop 2.7.2 on Windows

On 64-bit Windows there is no need to mess around with Cygwin to install Hadoop: unpack the release archive downloaded from the official site -> apply a minimal configuration to 4 basic files -> run 1 start command -> done. The only prerequisite is that a JDK is already installed on your machine and the Java environment variables are set. The steps are detailed below, using Hadoop 2.7.2 as the example.
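Before starting, it is worth double-checking that prerequisite. A quick check like the one below (in any new command prompt) should print a JDK version and a JAVA_HOME path; note that a JAVA_HOME containing spaces, such as one under Program Files, is a common source of trouble for the Hadoop start scripts on Windows.

C:\>java -version
C:\>echo %JAVA_HOME%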

1. Downloading the Hadoop package needs little explanation: go to http://hadoop.apache.org/ -> click Releases on the left -> click a mirror site -> open http://mirrors.tuna.tsinghua.edu.cn/apache/hadoop/common -> download hadoop-2.7.2.tar.gz;

2. Unpacking is equally straightforward: copy the archive to the root of the D: drive and extract it there, which produces the directory D:\hadoop-2.7.2. Set the HADOOP_HOME environment variable to that directory and append %HADOOP_HOME%\bin to PATH. Then download the related Windows helper tools from http://download.csdn.net/detail/wuxun1997/9841472, extract them, drop the files into D:\hadoop-2.7.2\bin, and also put a copy of hadoop.dll under C:\Windows\System32;
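After setting the variables, open a new command prompt and confirm that the hadoop command and the helper tools resolve; a quick sanity check, assuming the D:\hadoop-2.7.2 layout above (hadoop version should report 2.7.2):

D:\>echo %HADOOP_HOME%
D:\>where winutils.exe
D:\>hadoop version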

3. Go to D:\hadoop-2.7.2\etc\hadoop, find the following 4 files, and paste in the minimal configuration shown below:

core-site.xml

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>    
</configuration>

hdfs-site.xml

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>    
        <name>dfs.namenode.name.dir</name>    
        <value>file:/hadoop/data/dfs/namenode</value>    
    </property>    
    <property>    
        <name>dfs.datanode.data.dir</name>    
        <value>file:/hadoop/data/dfs/datanode</value>  
    </property>
</configuration>
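If you prefer to pin the NameNode and DataNode storage to an explicit drive and folder instead of the drive-relative file:/hadoop/... form above, a variant along these lines should also work (a sketch assuming you want the data under D:\hadoop\data):

    <property>
        <name>dfs.namenode.name.dir</name>
        <value>file:///D:/hadoop/data/dfs/namenode</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>file:///D:/hadoop/data/dfs/datanode</value>
    </property>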

mapred-site.xml

<configuration>
    <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
    </property>
</configuration>
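Note that the stock 2.7.x tarball usually ships only mapred-site.xml.template; if mapred-site.xml does not exist yet, copy the template first and then paste the block above into the copy:

D:\hadoop-2.7.2\etc\hadoop>copy mapred-site.xml.template mapred-site.xml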

yarn-site.xml

<configuration>
    <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
    </property>
    <property>
        <name>yarn.nodemanager.aux-services.mapreduce.shuffle.class</name>
        <value>org.apache.hadoop.mapred.ShuffleHandler</value>
    </property>
</configuration>

4. Open a Windows command prompt, change into the hadoop-2.7.2\bin directory, and run the 2 commands below: first format the NameNode, then start Hadoop.

D:\hadoop-2.7.2\bin>hadoop namenode -format
D:\hadoop-2.7.2\bin>cd ..\sbin

D:\hadoop-2.7.2\sbin>start-all.cmd
This script is Deprecated. Instead use start-dfs.cmd and start-yarn.cmd
starting yarn daemons

D:\hadoop-2.7.2\sbin>jps

The jps command should show that all 4 daemons (NameNode, DataNode, ResourceManager and NodeManager) have come up; at this point the Hadoop installation and startup are complete. You can now point a browser at localhost:8088 to watch MapReduce jobs, or at localhost:50070 -> Utilities -> Browse the file system to inspect HDFS files. When restarting Hadoop there is no need to format the NameNode again; just run stop-all.cmd followed by start-all.cmd.
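As a final smoke test you can push a file into HDFS and run the WordCount example bundled with the distribution; a sketch, assuming an arbitrary local file D:\test.txt and the examples jar that ships under %HADOOP_HOME%\share\hadoop\mapreduce:

D:\>hadoop fs -mkdir /input
D:\>hadoop fs -put D:\test.txt /input
D:\>hadoop fs -ls /input
D:\>hadoop jar %HADOOP_HOME%\share\hadoop\mapreduce\hadoop-mapreduce-examples-2.7.2.jar wordcount /input /output
D:\>hadoop fs -cat /output/part-r-00000

While the job runs it should also show up on the localhost:8088 page mentioned above, and the /input and /output directories can be browsed from the localhost:50070 file browser.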