
Pitfalls installing Hadoop 3.0.3 on CentOS


  1、but there is no HDFS_NAMENODE_USER defined. Aborting operation.
  
  [root@xcff sbin]# ./start-dfs.sh
  
  Starting namenodes on [localhost]
  
  ERROR: Attempting to operate on hdfs namenode as root
  
  ERROR: but there is no HDFS_NAMENODE_USER defined. Aborting operation.
  
  Starting datanodes
  
  ERROR: Attempting to operate on hdfs datanode as root
  
  ERROR: but there is no HDFS_DATANODE_USER defined. Aborting operation.
  
  Starting secondary namenodes [localhost]
  
  ERROR: Attempting to operate on hdfs secondarynamenode as root
  
  ERROR: but there is no HDFS_SECONDARYNAMENODE_USER defined. Aborting operation.
  
  Baffling at first. The cause: when launched as root, Hadoop 3 refuses to start a daemon unless the corresponding *_USER variable is defined. The fix:
  
  In /hadoop/hadoop-3.0.3/sbin/start-dfs.sh, add at the top:
  
  HDFS_DATANODE_USER=root
  
  HADOOP_SECURE_DN_USER=hdfs
  
  HDFS_NAMENODE_USER=root
  
  HDFS_SECONDARYNAMENODE_USER=root
  
  In /hadoop/hadoop-3.0.3/sbin/start-yarn.sh, add at the top (the YARN scripts check the YARN_* variables, not the HDFS_* ones):
  
  YARN_RESOURCEMANAGER_USER=root
  
  HADOOP_SECURE_DN_USER=yarn
  
  YARN_NODEMANAGER_USER=root
  
  In /hadoop/hadoop-3.0.3/sbin/stop-dfs.sh, add at the top:
  
  HDFS_DATANODE_USER=root
  
  HADOOP_SECURE_DN_USER=hdfs
  
  HDFS_NAMENODE_USER=root
  
  HDFS_SECONDARYNAMENODE_USER=root
  
  In /hadoop/hadoop-3.0.3/sbin/stop-yarn.sh, add at the top (again the YARN_* variables):
  
  YARN_RESOURCEMANAGER_USER=root
  
  HADOOP_SECURE_DN_USER=yarn
  
  YARN_NODEMANAGER_USER=root
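  Instead of patching four separate sbin/ scripts, the same variables can be exported once in hadoop-env.sh, which every start/stop script sources. A minimal sketch, assuming the install path used throughout this post:

```shell
# Assumed install root from this post; adjust to your layout.
HADOOP_HOME=/hadoop/hadoop-3.0.3

# Export the daemon-user variables once; all sbin/ scripts source this file,
# so start-dfs.sh, stop-dfs.sh, start-yarn.sh and stop-yarn.sh all pick them up.
cat >> "$HADOOP_HOME/etc/hadoop/hadoop-env.sh" <<'EOF'
export HDFS_NAMENODE_USER=root
export HDFS_DATANODE_USER=root
export HDFS_SECONDARYNAMENODE_USER=root
export YARN_RESOURCEMANAGER_USER=root
export YARN_NODEMANAGER_USER=root
EOF
```

  Running every daemon as root is fine for a test box, which is the scenario here, but on a real cluster dedicated hdfs/yarn users avoid this error entirely.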
  
  2、localhost: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
  
  This one means passwordless SSH login needs to be set up.
  
  [root@xcff sbin]# ./start-dfs.sh
  
  WARNING: HADOOP_SECURE_DN_USER has been replaced by HDFS_DATANODE_SECURE_USER. Using value of HADOOP_SECURE_DN_USER.
  
  Starting namenodes on [localhost]
  
  Last login: Mon Jan  7 15:19:55 CST 2019 from 192.168.101.18 on pts/8
  
  localhost: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
  
  Starting datanodes
  
  Last login: Mon Jan  7 15:34:53 CST 2019 on pts/0
  
  localhost: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
  
  Starting secondary namenodes [localhost]
  
  Last login: Mon Jan  7 15:34:53 CST 2019 on pts/0
  
  localhost: Permission denied (publickey,gssapi-keyex,gssapi-with-mic,password).
  
  The fix:
  
  ssh-keygen -t rsa -P ""
  
  This generates a key pair under ~/.ssh (the private key ends up at /root/.ssh/id_rsa). Then append the contents of the public key id_rsa.pub to authorized_keys; run this from inside ~/.ssh:
  
  cat id_rsa.pub >> authorized_keys
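  Putting the whole key setup together (a sketch; the chmod lines are added here because sshd refuses keys kept in group- or world-accessible files):

```shell
# One-shot passwordless-SSH setup for localhost, run as the user that starts Hadoop.
mkdir -p ~/.ssh && chmod 700 ~/.ssh

# Generate an RSA key pair non-interactively, unless one already exists.
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -P "" -f ~/.ssh/id_rsa

# Authorize our own public key; sshd requires authorized_keys to be mode 600.
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
chmod 600 ~/.ssh/authorized_keys
```

  Verify with `ssh localhost`: it should now log in without prompting for a password.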
  
  3、localhost: ERROR: JAVA_HOME is not set and could not be found.
  
  But the JAVA_HOME environment variable is set, so why doesn't it work?
  
  [root@xcff sbin]# ./start-dfs.sh
  
  WARNING: HADOOP_SECURE_DN_USER has been replaced by HDFS_DATANODE_SECURE_USER. Using value of HADOOP_SECURE_DN_USER.
  
  Starting namenodes on [localhost]
  
  Last login: Mon Jan  7 15:43:15 CST 2019 on pts/0
  
  localhost: ERROR: JAVA_HOME is not set and could not be found.
  
  Starting datanodes
  
  Last login: Mon Jan  7 15:45:53 CST 2019 on pts/0
  
  localhost: ERROR: JAVA_HOME is not set and could not be found.
  
  Starting secondary namenodes [localhost]
  
  Last login: Mon Jan  7 15:45:54 CST 2019 on pts/0
  
  localhost: ERROR: JAVA_HOME is not set and could not be found.
  
  Edit /hadoop/hadoop-3.0.3/etc/hadoop/hadoop-env.sh and set the variable explicitly: uncomment the export JAVA_HOME line, set it to the directory printed by echo $JAVA_HOME, and save. The daemons are launched over SSH and do not inherit the login shell's environment, which is why the exported variable alone is not enough.
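  The edit can also be scripted. A sketch, assuming the install path from this post and that JAVA_HOME is correctly set in the current shell (it captures whatever `echo $JAVA_HOME` prints at that moment):

```shell
# Assumed install root from this post; adjust to your layout.
HADOOP_HOME=/hadoop/hadoop-3.0.3

# Hard-code the current JAVA_HOME into hadoop-env.sh, so daemons
# launched over ssh (which don't inherit the shell env) can find Java.
echo "export JAVA_HOME=$JAVA_HOME" >> "$HADOOP_HOME/etc/hadoop/hadoop-env.sh"
```

  After this, ./start-dfs.sh should start all three HDFS daemons without the JAVA_HOME error.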
