
Fixing DataNodes That Fail to Start: org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool

Problem description:
FATAL org.apache.hadoop.hdfs.server.datanode.DataNode: Initialization failed for Block pool <registering> (Datanode Uuid unassigned) service to master/192.168.235.129:8020. Exiting.
java.io.IOException: All specified directories are failed to load.
at org.apache.hadoop.hdfs.server.datanode.DataStorage.recoverTransitionRead(DataStorage.java:478)
at org.apache.hadoop.hdfs.server.datanode.DataNode.initStorage(DataNode.java:1338)
at org.apache.hadoop.hdfs.server.datanode.DataNode.initBlockPool(DataNode.java:1304)
at org.apache.hadoop.hdfs.server.datanode.BPOfferService.verifyAndSetNamespaceInfo(BPOfferService.java:314)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.connectToNNAndHandshake(BPServiceActor.java:226)
at org.apache.hadoop.hdfs.server.datanode.BPServiceActor.run(BPServiceActor.java:867)
at java.lang.Thread.run(Thread.java:748)
Fix:
1. Stop the cluster: stop-dfs.sh
2. Delete all files under the dfs.namenode.name.dir and dfs.datanode.data.dir directories (warning: this erases all data stored in HDFS)
3. Reformat the NameNode: bin/hadoop namenode -format
4. Restart the cluster: start-dfs.sh
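A common trigger for "All specified directories are failed to load" is that the NameNode was reformatted while the DataNodes kept their old storage directories, so the clusterID in the DataNode's current/VERSION file no longer matches the NameNode's. Before wiping anything, you can compare the two IDs. The sketch below uses mock VERSION files in a temp directory purely to illustrate the check; on a real cluster the paths would be your dfs.namenode.name.dir/current/VERSION and dfs.datanode.data.dir/current/VERSION.

```shell
#!/bin/sh
# Illustrative sketch: compare the clusterID recorded by the NameNode
# and by a DataNode. Mock VERSION files stand in for the real ones,
# which on a real cluster live under the dfs.namenode.name.dir and
# dfs.datanode.data.dir paths from hdfs-site.xml.
tmp=$(mktemp -d)
printf 'clusterID=CID-1111\n' > "$tmp/nn_VERSION"   # mock NameNode VERSION
printf 'clusterID=CID-2222\n' > "$tmp/dn_VERSION"   # mock DataNode VERSION (stale)

# Extract the clusterID value from each VERSION file.
nn_id=$(grep '^clusterID=' "$tmp/nn_VERSION" | cut -d= -f2)
dn_id=$(grep '^clusterID=' "$tmp/dn_VERSION" | cut -d= -f2)

if [ "$nn_id" = "$dn_id" ]; then
  echo "clusterIDs match: $nn_id"
else
  echo "clusterID mismatch: NameNode=$nn_id DataNode=$dn_id"
fi
rm -rf "$tmp"
```

If the IDs differ and the HDFS data is disposable, the four steps above (stop, delete, reformat, start) reset everything to a consistent state.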