
Installing and Configuring Hadoop on CentOS


  • Versions: CentOS-6.8-x86_64-minimal, Hadoop 2.6.4, JDK 1.7.0
  • First download the JDK and Hadoop tarballs, transfer them to the CentOS host, and extract them

Downloading and transferring are not covered here; the extraction command is tar -zxvf <archive>

To rename: mv <old name> <new name>

(note the space between the two arguments)
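As a concrete sketch of the extract-and-rename step (the archive and directory names below are stand-ins; substitute the real JDK/Hadoop tarball names):

```shell
# Build a throwaway archive so the example is self-contained;
# on a real machine you would use the downloaded tarball instead
mkdir -p demo/jdk1.7.0_25
tar -zcf jdk-demo.tar.gz -C demo jdk1.7.0_25
rm -rf demo

# -z gzip, -x extract, -v verbose, -f archive file
tar -zxvf jdk-demo.tar.gz

# mv OLD NEW -- note the single space between the two names
mv jdk1.7.0_25 jdk
```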

  • Configure the JDK first
  1. cd into the JDK directory and run pwd; copy the path it prints for later use, e.g. /apps/jdk1.7.0_25
  2. Configure the environment variables

vi ~/.bash_profile

export JAVA_HOME=/apps/jdk1.7.0_25
export PATH=$PATH:$HOME/bin:$JAVA_HOME/bin

source ~/.bash_profile
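A minimal sketch of the same step, assuming the JDK really sits at /apps/jdk1.7.0_25:

```shell
# Append the variables to ~/.bash_profile (export makes them visible
# to child processes such as the Hadoop scripts)
cat >> ~/.bash_profile <<'EOF'
export JAVA_HOME=/apps/jdk1.7.0_25
export PATH=$PATH:$HOME/bin:$JAVA_HOME/bin
EOF

# Reload the profile in the current shell and confirm the PATH entry
source ~/.bash_profile
echo "$PATH" | grep -q '/apps/jdk1.7.0_25/bin' && echo "JDK on PATH"
```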

  • Turn off the firewall and set up passwordless SSH login
    • Turn off the firewall
      • service iptables stop
      • chkconfig iptables off
    • Passwordless SSH login (first run ssh-keygen -t rsa on every node)
      • master: cat /root/.ssh/id_rsa.pub >> /root/.ssh/authorized_keys
      • master: scp /root/.ssh/authorized_keys @slave1:/root/.ssh/authorized_keys
      • slave1: cat /root/.ssh/id_rsa.pub >> /root/.ssh/authorized_keys
      • slave1: scp /root/.ssh/authorized_keys @slave2:/root/.ssh/authorized_keys
      • slave2: cat /root/.ssh/id_rsa.pub >> /root/.ssh/authorized_keys
      • slave2: scp /root/.ssh/authorized_keys @master:/root/.ssh/authorized_keys
      • master: scp /root/.ssh/authorized_keys @slave1:/root/.ssh/authorized_keys
      • master: scp /root/.ssh/authorized_keys @slave2:/root/.ssh/authorized_keys
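The generate-and-append step on any one node can be sketched as below (the demo writes to a scratch directory; on the real nodes the files live under /root/.ssh):

```shell
# Scratch directory stands in for /root/.ssh in this demo
DEMO=/tmp/ssh_demo
rm -rf "$DEMO" && mkdir -p "$DEMO"

# Generate an RSA key pair non-interactively (-N '' = empty passphrase)
ssh-keygen -t rsa -N '' -f "$DEMO/id_rsa" >/dev/null

# Append the public key; sshd requires authorized_keys to be mode 600
cat "$DEMO/id_rsa.pub" >> "$DEMO/authorized_keys"
chmod 600 "$DEMO/authorized_keys"
```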
  • Then configure Hadoop
  1. Environment variables: vi ~/.bash_profile

    export HADOOP_HOME=/apps/hadoop-2.6.4
    export PATH=$PATH:$HOME/bin:$JAVA_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin

    source ~/.bash_profile
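Analogous to the JDK step, the Hadoop variables can be appended and checked like this (paths assume the layout above):

```shell
cat >> ~/.bash_profile <<'EOF'
export HADOOP_HOME=/apps/hadoop-2.6.4
export PATH=$PATH:$HADOOP_HOME/bin:$HADOOP_HOME/sbin
EOF
source ~/.bash_profile

# bin/ holds the client commands (hadoop, hdfs); sbin/ holds the
# start/stop scripts (start-dfs.sh, start-yarn.sh)
echo "$PATH" | grep -q '/apps/hadoop-2.6.4/sbin' && echo "Hadoop on PATH"
```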

  2. Configure the runtime configuration files
    • core-site.xml
      • <property>
        <name>fs.defaultFS</name>
        <value>hdfs://master:9000</value>
        </property>

    • hdfs-site.xml

      • <property>
        <name>dfs.replication</name>
        <value>3</value>
        </property>

        <property>
        <name>dfs.namenode.name.dir</name>
        <value>/app/hadoop/dfs/name</value>
        </property>

        <property>
        <name>dfs.datanode.data.dir</name>
        <value>/app/hadoop/dfs/data</value>
        </property>

        <property>
        <name>dfs.namenode.secondary.http-address</name>
        <value>slave2:50090</value>
        </property>

        <property>
        <name>dfs.namenode.checkpoint.dir</name>
        <value>/app/hadoop/dfs/namesecondary</value>
        </property>

    • hadoop-env.sh
      • export JAVA_HOME=/apps/jdk1.7.0_25
    • yarn-site.xml
      • <property>
        <name>yarn.nodemanager.aux-services</name>
        <value>mapreduce_shuffle</value>
        </property>

        <property>
        <name>yarn.resourcemanager.hostname</name>
        <value>master</value>
        </property>


    • mapred-site.xml
      • <property>
        <name>mapreduce.framework.name</name>
        <value>yarn</value>
        </property>
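One detail worth noting: the Hadoop 2.x tarball ships only mapred-site.xml.template, so the file must be created before editing. A self-contained sketch (the demo writes to /tmp; on the cluster the file belongs in $HADOOP_HOME/etc/hadoop):

```shell
# On the cluster:
#   cd /apps/hadoop-2.6.4/etc/hadoop
#   cp mapred-site.xml.template mapred-site.xml
# The demo below writes the finished file to /tmp instead:
cat > /tmp/mapred-site.xml <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>
EOF
```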

  3. Copy the configured JDK and Hadoop to the other two hosts (run from /, so that apps/ refers to /apps)
    • master: scp -r apps/ @slave1:/apps/
    • master: scp -r apps/ @slave2:/apps/
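A caveat about scp -r: like cp -r, it copies the source directory itself, so if /apps already exists on a slave the files end up nested at /apps/apps; on a fresh slave where /apps does not yet exist, the files land at the intended path. The local cp -r stand-in below shows both behaviours:

```shell
rm -rf /tmp/gr_src /tmp/dst_existing /tmp/dst_fresh

# A source tree shaped like the master's /apps
mkdir -p /tmp/gr_src/apps/hadoop-2.6.4

# Case 1: target directory already exists -> the copy nests one level deep
mkdir -p /tmp/dst_existing/apps
cp -r /tmp/gr_src/apps /tmp/dst_existing/apps/
ls /tmp/dst_existing/apps          # -> apps  (i.e. /apps/apps on a real slave)

# Case 2: target does not exist yet -> the copy lands at the intended path
mkdir -p /tmp/dst_fresh
cp -r /tmp/gr_src/apps /tmp/dst_fresh/apps
ls /tmp/dst_fresh/apps             # -> hadoop-2.6.4
```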

  The Hadoop cluster configuration is complete.
