
Accessing HDFS throws: org.apache.hadoop.security.AccessControlException: Permission denied

The client code that triggers the error:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class TestHDFS {
    public static void main(String[] args) throws Exception{
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://192.168.0.104:9000");
        FileSystem fs = FileSystem.get(conf);

        // mkdirs returns true if the directory was created or already exists
        boolean success = fs.mkdirs(new Path("/xiaol"));
        System.out.println(success);
    }
}

Running it fails with:

Exception in thread "main" org.apache.hadoop.security.AccessControlException: Permission denied: user=xiaol, access=WRITE, inode="/xiaol":root:supergroup:drwxr-xr-x

The message itself explains the problem: the inode is owned by root (group supergroup) with mode drwxr-xr-x, so a client identifying itself as xiaol has no write permission.

Workarounds you'll find online:

  1. In the HDFS configuration file (hdfs-site.xml), set dfs.permissions.enabled to false (see the snippet after this list)

  2. hadoop fs -chmod 777 /
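
For reference, the first workaround is this change to hdfs-site.xml on the NameNode; note that it turns permission checking off for the entire cluster:

<!-- hdfs-site.xml: disables HDFS permission checking cluster-wide -->
<property>
  <name>dfs.permissions.enabled</name>
  <value>false</value>
</property>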

Frankly, both of these are garbage: the first disables permission checking for the whole cluster, and the second makes the entire filesystem world-writable. Neither addresses the real problem, which is that the client presents the wrong username.


When a client accesses HDFS, Hadoop performs a permission check against a username that it resolves as follows (sketched in code after the list):

  1. Read the HADOOP_USER_NAME OS environment variable; if it is non-empty, use it as the username. Otherwise,

  2. read the HADOOP_USER_NAME Java system property; if that is empty too,

  3. take the username from the JVM login principal, an instance of com.sun.security.auth.NTUserPrincipal (Windows) or com.sun.security.auth.UnixPrincipal (Unix).

  4. If all of the above fail, throw LoginException("Can't find user name").
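
A minimal sketch of that lookup order, assuming the behavior described above (the real logic lives in Hadoop's UserGroupInformation login module; user.name stands in here for the principal-based fallback):

import javax.security.auth.login.LoginException;

public class UserNameLookup {
    public static String resolveUser() throws LoginException {
        // 1. OS environment variable
        String user = System.getenv("HADOOP_USER_NAME");
        if (user == null || user.isEmpty()) {
            // 2. Java system property (what the fix below relies on)
            user = System.getProperty("HADOOP_USER_NAME");
        }
        if (user == null || user.isEmpty()) {
            // 3. fall back to the OS login user; Hadoop actually reads this from
            //    com.sun.security.auth.UnixPrincipal / NTUserPrincipal
            user = System.getProperty("user.name");
        }
        if (user == null || user.isEmpty()) {
            // 4. nothing worked
            throw new LoginException("Can't find user name");
        }
        return user;
    }

    public static void main(String[] args) throws LoginException {
        System.out.println(resolveUser());
    }
}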


The fix: set the HADOOP_USER_NAME Java system property before creating the FileSystem:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class TestHDFS {
    public static void main(String[] args) throws Exception{
        // Must be set before FileSystem.get(), or Hadoop falls back to the OS user
        System.setProperty("HADOOP_USER_NAME", "root");

        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://192.168.0.104:9000");
        FileSystem fs = FileSystem.get(conf);

        // mkdirs returns true if the directory was created or already exists
        boolean success = fs.mkdirs(new Path("/xiaol"));
        System.out.println(success);
    }
}
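
If you'd rather not hard-code the username, export HADOOP_USER_NAME=root in the environment (or pass -DHADOOP_USER_NAME=root to the JVM). There is also a FileSystem.get overload that takes the user explicitly; a minimal sketch using the same cluster address as above:

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class TestHDFSAsUser {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // The three-argument overload performs the request as the given user
        FileSystem fs = FileSystem.get(new URI("hdfs://192.168.0.104:9000"), conf, "root");

        boolean success = fs.mkdirs(new Path("/xiaol"));
        System.out.println(success);
    }
}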