
[HBase Exception] java.io.IOException: Could not locate executable null\bin\winutils.exe when using HBase on Windows

Development here is usually done on Windows, and when an HBase client runs in a Windows environment it may throw an exception (java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.). I have run into this before, and today a colleague hit it again, so I am writing it down while it is fresh. The exception looks like this:

2016-05-23 17:02:13,551 WARN [org.apache.hadoop.util.NativeCodeLoader] - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
2016-05-23 17:02:13,611 ERROR [org.apache.hadoop.util.Shell] - Failed to locate the winutils binary in the hadoop binary path
java.io.IOException: Could not locate executable null\bin\winutils.exe in the Hadoop binaries.
    at org.apache.hadoop.util.Shell.getQualifiedBinPath(Shell.java:278)
    at org.apache.hadoop.util.Shell.getWinUtilsPath(Shell.java:300)
    at org.apache.hadoop.util.Shell.<clinit>(Shell.java:293)
    at org.apache.hadoop.util.StringUtils.<clinit>(StringUtils.java:76)
    at org.apache.hadoop.conf.Configuration.getStrings(Configuration.java:1514)
    at org.apache.hadoop.hbase.zookeeper.ZKConfig.makeZKProps(ZKConfig.java:113)
    at org.apache.hadoop.hbase.zookeeper.ZKConfig.getZKQuorumServersString(ZKConfig.java:265)
    at org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.<init>(ZooKeeperWatcher.java:159)
    at org.apache.hadoop.hbase.zookeeper.ZooKeeperWatcher.<init>(ZooKeeperWatcher.java:134)
    at org.apache.hadoop.hbase.client.ZooKeeperKeepAliveConnection.<init>(ZooKeeperKeepAliveConnection.java:43)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.getKeepAliveZooKeeperWatcher(HConnectionManager.java:1710)
    at org.apache.hadoop.hbase.client.ZooKeeperRegistry.getClusterId(ZooKeeperRegistry.java:82)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.retrieveClusterId(HConnectionManager.java:806)
    at org.apache.hadoop.hbase.client.HConnectionManager$HConnectionImplementation.<init>(HConnectionManager.java:633)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
    at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:387)
    at org.apache.hadoop.hbase.client.HConnectionManager.createConnection(HConnectionManager.java:282)
    at net.shgaoxin.db.hbase.HbaseConnectionFactory.createResource(HbaseConnectionFactory.java:67)
    at net.shgaoxin.db.hbase.HbaseConnectionFactory.makeObject(HbaseConnectionFactory.java:40)
    at org.apache.commons.pool2.impl.GenericObjectPool.create(GenericObjectPool.java:868)
    at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:435)
    at org.apache.commons.pool2.impl.GenericObjectPool.borrowObject(GenericObjectPool.java:363)
    at net.shgaoxin.base.AbstractPooledContainer.get(AbstractPooledContainer.java:49)
    at net.shgaoxin.db.hbase.HbaseConnectionContainer.getConnection(HbaseConnectionContainer.java:46)
    at net.shgaoxin.db.hbase.HbaseConnectionContainer.getConnection(HbaseConnectionContainer.java:14)
    at net.shgaoxin.db.hbase.HbaseTemplate.scan(HbaseTemplate.java:398)
    at net.shgaoxin.impl.dao.hbase.GenericDaoHbaseImpl.scan(GenericDaoHbaseImpl.java:73)
    at net.shgaoxin.impl.service.eastdayminisitesp.ImgUploadServiceImpl.getCurrentStepRowkeys(ImgUploadServiceImpl.java:260)
    at net.shgaoxin.impl.context.eastdayminisitesp.AsyncImgUploadContextImpl.doOnStart(AsyncImgUploadContextImpl.java:81)
    at net.shgaoxin.impl.context.AbstractProcessQueue.start(AbstractProcessQueue.java:119)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
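As the trace shows, the error surfaces as soon as the client creates its first HBase connection, which pulls in org.apache.hadoop.util.Shell. A minimal sketch of that kind of client code follows; the class names match the HBase 0.96/0.98-era API seen in the trace, and the ZooKeeper quorum address and port are placeholders, not values from the original setup:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.HConnection;
import org.apache.hadoop.hbase.client.HConnectionManager;

public class HbaseConnectDemo {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();
    conf.set("hbase.zookeeper.quorum", "192.168.1.100");        // placeholder quorum host
    conf.set("hbase.zookeeper.property.clientPort", "2181");    // default ZooKeeper client port

    // Creating the connection runs the static initializer of
    // org.apache.hadoop.util.Shell; on Windows that is where the
    // "Could not locate executable null\bin\winutils.exe" error is logged.
    HConnection connection = HConnectionManager.createConnection(conf);
    System.out.println("connection closed? " + connection.isClosed());
    connection.close();
  }
}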

Looking at the Hadoop source (org.apache.hadoop.util.Shell):

/** fully qualify the path to a binary that should be in a known hadoop 
   *  bin location. This is primarily useful for disambiguating call-outs 
   *  to executable sub-components of Hadoop to avoid clashes with other 
   *  executables that may be in the path.  Caveat:  this call doesn't 
   *  just format the path to the bin directory.  It also checks for file 
   *  existence of the composed path. The output of this call should be 
   *  cached by callers.
   * */
  public static final String getQualifiedBinPath(String executable) 
  throws IOException {
    // construct hadoop bin path to the specified executable
    String fullExeName = HADOOP_HOME_DIR + File.separator + "bin" 
      + File.separator + executable;

    File exeFile = new File(fullExeName);
    if (!exeFile.exists()) {
      throw new IOException("Could not locate executable " + fullExeName
        + " in the Hadoop binaries.");
    }

    return exeFile.getCanonicalPath();
  }


  private static String HADOOP_HOME_DIR = checkHadoopHome();

 /** Centralized logic to discover and validate the sanity of the Hadoop 
   *  home directory. Returns either NULL or a directory that exists and 
   *  was specified via either -Dhadoop.home.dir or the HADOOP_HOME ENV 
   *  variable.  This does a lot of work so it should only be called 
   *  privately for initialization once per process.
   **/
  private static String checkHadoopHome() {

    // first check the Dflag hadoop.home.dir with JVM scope
    String home = System.getProperty("hadoop.home.dir");

    // fall back to the system/user-global env variable
    if (home == null) {
      home = System.getenv("HADOOP_HOME");
    }

    try {
       // couldn't find either setting for hadoop's home directory
       if (home == null) {
         throw new IOException("HADOOP_HOME or hadoop.home.dir are not set.");
       }

       if (home.startsWith("\"") && home.endsWith("\"")) {
         home = home.substring(1, home.length()-1);
       }

       // check that the home setting is actually a directory that exists
       File homedir = new File(home);
       if (!homedir.isAbsolute() || !homedir.exists() || !homedir.isDirectory()) {
         throw new IOException("Hadoop home directory " + homedir
           + " does not exist, is not a directory, or is not an absolute path.");
       }

       home = homedir.getCanonicalPath();

    } catch (IOException ioe) {
      if (LOG.isDebugEnabled()) {
        LOG.debug("Failed to detect a valid hadoop home directory", ioe);
      }
      home = null;
    }

    return home;
  }
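Since checkHadoopHome() only consults the hadoop.home.dir system property and the HADOOP_HOME environment variable, a quick way to see which of the two (if either) your JVM actually picks up is to print both before touching any Hadoop or HBase class. A trivial sketch:

public class HadoopHomeCheck {
  public static void main(String[] args) {
    // The same two settings that org.apache.hadoop.util.Shell#checkHadoopHome() reads.
    System.out.println("hadoop.home.dir = " + System.getProperty("hadoop.home.dir"));
    System.out.println("HADOOP_HOME     = " + System.getenv("HADOOP_HOME"));
  }
}

If both print null, the exception above is exactly what you will get.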

Putting the exception together with the source above, it is clear that HADOOP_HOME_DIR is null, so the problem is almost certainly the HADOOP_HOME environment variable. Hadoop had in fact never been configured on this Windows machine, so the error is hardly surprising.
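Incidentally, the literal text "null" in the reported path is just plain Java string concatenation: a null HADOOP_HOME_DIR is rendered as the string "null" when getQualifiedBinPath() builds the path. A tiny standalone illustration:

import java.io.File;

public class NullPathDemo {
  public static void main(String[] args) {
    String hadoopHomeDir = null;  // what checkHadoopHome() returns when nothing is configured
    String fullExeName = hadoopHomeDir + File.separator + "bin" + File.separator + "winutils.exe";
    System.out.println(fullExeName);  // on Windows: null\bin\winutils.exe
  }
}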

Environment variable configuration: add a HADOOP_HOME variable that points to a local directory whose bin folder contains winutils.exe, since getQualifiedBinPath() resolves the executable as HADOOP_HOME\bin\winutils.exe.

(Screenshot in the original post: adding HADOOP_HOME under the Windows system environment variables.)

Restart the machine (or at least the IDE/terminal that launches the JVM) so the new variable is visible to the process.
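If you prefer not to (or cannot) change machine-wide environment variables, checkHadoopHome() also honours the hadoop.home.dir JVM system property, so setting it programmatically before the first Hadoop/HBase class is loaded works as well. A sketch, assuming winutils.exe has been placed under D:\hadoop\bin (the path is only an example):

public class HbaseClientBootstrap {
  public static void main(String[] args) throws Exception {
    // Must point at a directory that contains bin\winutils.exe, because
    // getQualifiedBinPath() appends "\bin\winutils.exe" to this value.
    // D:\hadoop is just an example location.
    System.setProperty("hadoop.home.dir", "D:\\hadoop");

    // ... create the HBase connection as usual from here on ...
  }
}

The same setting can also be passed on the command line as -Dhadoop.home.dir=D:\hadoop, which checkHadoopHome() checks before falling back to HADOOP_HOME.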