
Hadoop debugging: Exception in thread "main" java.lang.UnsatisfiedLinkError

  • Full exception message
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
  • The stack trace is as follows:

    java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Ljava/lang/String;I)Z
        at org.apache.hadoop.io.nativeio.NativeIO$Windows.access0(Native Method)
        at org.apache.hadoop.io.nativeio.NativeIO$Windows.access(NativeIO.java:609)
        at org.apache.hadoop.fs.FileUtil.canRead(FileUtil.java:980)
        at org.apache.hadoop.util.DiskChecker.checkAccessByFileMethods(DiskChecker.java:187)
        at org.apache.hadoop.util.DiskChecker.checkDirAccess(DiskChecker.java:174)
        at org.apache.hadoop.util.DiskChecker.checkDir(DiskChecker.java:108)
        at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.confChanged(LocalDirAllocator.java:314)
        at org.apache.hadoop.fs.LocalDirAllocator$AllocatorPerContext.getLocalPathForWrite(LocalDirAllocator.java:377)
        at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:151)
        at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:132)
        at org.apache.hadoop.fs.LocalDirAllocator.getLocalPathForWrite(LocalDirAllocator.java:116)
        at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:125)
        at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:171)
        at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:758)
        at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:244)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1307)
        at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1304)
        at java.security.AccessController.doPrivileged(Native Method)
        at javax.security.auth.Subject.doAs(Subject.java:422)
        at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1920)
        at org.apache.hadoop.mapreduce.Job.submit(Job.java:1304)
        at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1325)
        ……
  • This exception is commonly seen when debugging Hadoop locally on Windows.

From the stack trace, the problem originates in the access check at NativeIO.java:609; the code is as follows:

    /**
     * Checks whether the current process has desired access rights on
     * the given path.
     *
     * Longer term this native function can be substituted with JDK7
     * function Files#isReadable, isWritable, isExecutable.
     *
     * @param path input path
     * @param desiredAccess ACCESS_READ, ACCESS_WRITE or ACCESS_EXECUTE
     * @return true if access is allowed
     * @throws IOException I/O exception on error
     */
    public static boolean access(String path, AccessRight desiredAccess)
        throws IOException {
      return access0(path, desiredAccess.accessRight());
    }

As the code shows, this method asks the operating system kernel whether the current process has the requested access rights on the path. Because of Windows' own permission model, this native call frequently throws the error above.
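The javadoc above itself notes that the native call could eventually be replaced with the JDK 7 `java.nio.file` APIs. As an illustration of what such a pure-JDK access check looks like, here is a minimal sketch; the class and method names below are hypothetical and are not part of Hadoop:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class AccessCheck {
    // Pure-JDK equivalents of the native access check, using the
    // Files#isReadable / isWritable methods mentioned in the javadoc.
    // These go through the JVM instead of a native library, so they
    // work on Windows without hadoop.dll.
    public static boolean canRead(String path) {
        return Files.isReadable(Paths.get(path));
    }

    public static boolean canWrite(String path) {
        return Files.isWritable(Paths.get(path));
    }

    public static void main(String[] args) throws IOException {
        // Check a freshly created temp file, which the process can
        // both read and write.
        Path tmp = Files.createTempFile("access-check", ".tmp");
        System.out.println("readable: " + canRead(tmp.toString()));
        System.out.println("writable: " + canWrite(tmp.toString()));
        Files.delete(tmp);
    }
}
```

Note that these JDK methods report what the JVM can actually do with the file, which is exactly the semantics the Hadoop check needs, without any dependency on native code.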

  • Solution 1: temporarily patch the source for testing

Locate the source of NativeIO, create a file at the same package path inside the test project, copy the entire class into it, and modify the access method as shown below:

    public static boolean access(String path, AccessRight desiredAccess)
        throws IOException {
      return true;
      // return access0(path, desiredAccess.accessRight());
    }

By making the check always return true, the problem is bypassed.

  • Solution 2: debug against a remote Linux cluster

Since the problem is rooted in Windows, it can be avoided by debugging remotely against a Hadoop cluster running on Linux (or a properly configured Windows environment).
The fix is to add the remote environment's Hadoop configuration files (listed below) to the debug project's environment.

  • core-site.xml
  • hdfs-site.xml
  • yarn-site.xml (can be omitted when not debugging jobs)
  • mapred-site.xml (usually optional)
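For example, the core-site.xml copied from the remote cluster would contain an entry like the following; the hostname and port here are placeholders and must match your own cluster's NameNode address:

```xml
<configuration>
  <property>
    <!-- Points the debug client at the remote cluster's NameNode
         instead of the local Windows filesystem. -->
    <name>fs.defaultFS</name>
    <value>hdfs://namenode-host:9000</value>
  </property>
</configuration>
```

With this file on the debug project's classpath, job submission and file access are handled by the remote cluster, so the Windows-only native check is never exercised.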