
Common HDFS APIs (Java)


1. Download a file to the local filesystem

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.URL;

import org.apache.hadoop.fs.FsUrlStreamHandlerFactory;
import org.apache.hadoop.io.IOUtils;

public class HdfsUrlTest {

    static {
        // Register the URL stream handler factory so Java can recognize hdfs:// URLs.
        // The JVM allows this to be set only once.
        URL.setURLStreamHandlerFactory(new FsUrlStreamHandlerFactory());
    }

    public static void main(String[] args) {
        InputStream in = null;
        OutputStream out = null;
        try {
            String file = "hdfs://hadoop-yarn.test.com:8020/user/testi/conf/core-site.xml";
            URL fileUrl = new URL(file);
            in = fileUrl.openStream();
            out = new FileOutputStream(new File("d:/core-site.xml"));
            // Download the file to the local filesystem.
            IOUtils.copyBytes(in, out, 4096, false);
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            // Close both streams.
            IOUtils.closeStream(in);
            IOUtils.closeStream(out);
        }
    }
}
```

2. View cluster status

```java
public static void clusterStatus() throws Exception {
    FileSystem fs = HdfsUtil.getFs();
    DistributedFileSystem dfs = (DistributedFileSystem) fs;
    FsStatus fss = dfs.getStatus();
    // Print the hostname of every datanode in the cluster.
    DatanodeInfo[] datanodeInfos = dfs.getDataNodeStats();
    for (DatanodeInfo datanodeInfo : datanodeInfos) {
        System.out.println(datanodeInfo.getHostName());
    }
}
```

3. Create a directory

```java
public void testHDFSMkdir() throws Exception {
    String hdfsUrl = "hdfs://192.168.10.11:8020";
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(URI.create(hdfsUrl), conf);
    Path path = new Path("/bigdata");
    fs.mkdirs(path);
}
```

4. Create a file

```java
public void testCreateFile() throws Exception {
    String hdfsUrl = "hdfs://192.168.10.11:8020";
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(URI.create(hdfsUrl), conf);
    Path path = new Path("/bigdata/a.txt");
    FSDataOutputStream out = fs.create(path);
    out.write("hello hadoop".getBytes());
    // Close the stream so the data is flushed to HDFS.
    out.close();
}
```

5. Rename a file

```java
public void testRenameFile() throws Exception {
    String hdfsUrl = "hdfs://192.168.10.11:8020";
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(URI.create(hdfsUrl), conf);
    Path path = new Path("/bigdata/a.txt");
    Path newPath = new Path("/bigdata/b.txt");
    System.out.println(fs.rename(path, newPath));
}
```

6. Upload a file

```java
public static void write() throws Exception {
    FileSystem fs = HdfsUtil.getFs();
    OutputStream outStream = fs.create(new Path("/user/lcc/conf/put-core-site.xml"));
    FileInputStream inStream = new FileInputStream(new File("d:/core-site.xml"));
    // The final argument (true) closes both streams when the copy completes.
    IOUtils.copyBytes(inStream, outStream, 4096, true);
}
```
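As an alternative sketch, `FileSystem` also provides `copyFromLocalFile`, which handles the stream copying internally. This reuses the paths above and the `HdfsUtil` helper assumed elsewhere in this post:

```java
// Upload via FileSystem.copyFromLocalFile instead of manual stream copying.
// HdfsUtil.getFs() is the helper used throughout this post; the paths are
// the same placeholders as in the example above.
public static void upload() throws Exception {
    FileSystem fs = HdfsUtil.getFs();
    fs.copyFromLocalFile(new Path("d:/core-site.xml"),
                         new Path("/user/lcc/conf/put-core-site.xml"));
    fs.close();
}
```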

The simplest way for a Java program to read data from HDFS is to open a stream with a `java.net.URL` object and read from it.

This requires registering an `FsUrlStreamHandlerFactory` instance via `URL.setURLStreamHandlerFactory` so that Java recognizes `hdfs://` URLs. The JVM allows this method to be called only once, so the drawback is that if another part of the program (or a third-party library) has already set a factory, this approach can no longer be used to read data from Hadoop.

The second method uses the FileSystem API to open an input stream for the file.

In the Hadoop filesystem, a file or directory is represented as a Hadoop `Path` object, which can be thought of as a URL within the Hadoop filesystem.
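A minimal sketch of this FileSystem-based read (the address and file path are placeholders reusing earlier examples):

```java
// Read an HDFS file via the FileSystem API instead of java.net.URL.
Configuration conf = new Configuration();
FileSystem fs = FileSystem.get(URI.create("hdfs://192.168.10.11:8020"), conf);
InputStream in = null;
try {
    // fs.open returns an FSDataInputStream for the given Path.
    in = fs.open(new Path("/bigdata/a.txt"));
    // Copy the file contents to standard output.
    IOUtils.copyBytes(in, System.out, 4096, false);
} finally {
    IOUtils.closeStream(in);
}
```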

There are three ways to obtain a `FileSystem` instance: `FileSystem.get(conf)`, `FileSystem.get(uri, conf)`, and `FileSystem.newInstance(uri, conf)`.
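The three factory methods can be sketched as follows (the URI is a placeholder for your cluster address):

```java
Configuration conf = new Configuration();

// 1. Use the default filesystem named by fs.defaultFS in the configuration.
FileSystem fs1 = FileSystem.get(conf);

// 2. Use an explicit URI to pick the filesystem.
FileSystem fs2 = FileSystem.get(URI.create("hdfs://192.168.10.11:8020"), conf);

// 3. newInstance bypasses the internal cache and always returns a new instance.
FileSystem fs3 = FileSystem.newInstance(URI.create("hdfs://192.168.10.11:8020"), conf);
```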
