This article walks through some basic, commonly used HDFS operations via the Java API, with a short example program for each. Hopefully you will come away with something useful; let's take a look.
1: Get the last modification time of an HDFS file
```java
import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Test6GetLTime {
    /**
     * Get the last modification time of an HDFS file.
     */
    public static void main(String[] args) {
        try {
            Configuration conf = new Configuration();
            URI uri = new URI("hdfs://192.168.226.129:9000");
            FileSystem fs = FileSystem.get(uri, conf);
            Path dfs = new Path("hdfs://192.168.226.129:9000/");
            FileStatus fileStatus = fs.getFileStatus(dfs);
            // getModificationTime() returns milliseconds since the epoch
            long modificationTime = fileStatus.getModificationTime();
            System.out.println("Modification time is: " + modificationTime);
        } catch (IllegalArgumentException | URISyntaxException | IOException e) {
            e.printStackTrace();
        }
    }
}
```
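The raw value printed above is an epoch-millisecond timestamp, which is not very readable on its own. As a small illustration (the class and method names here are made up for this sketch, not part of the Hadoop API), it can be turned into a human-readable UTC string with the standard `java.time` classes:

```java
import java.time.Instant;
import java.time.ZoneOffset;
import java.time.format.DateTimeFormatter;

public class ModTimeFormat {
    // Convert an epoch-millisecond timestamp, such as the value returned by
    // FileStatus.getModificationTime(), into a readable UTC string.
    public static String format(long epochMillis) {
        return DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss")
                .withZone(ZoneOffset.UTC)
                .format(Instant.ofEpochMilli(epochMillis));
    }

    public static void main(String[] args) {
        // The epoch start, 0 ms, formats as 1970-01-01 00:00:00
        System.out.println(format(0L));
    }
}
```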
2: Find where a file is stored on the HDFS cluster
```java
import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.BlockLocation;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class Test7FileLocation {
    /**
     * Find where a file's blocks are stored on the HDFS cluster.
     */
    public static void main(String[] args) {
        try {
            Configuration conf = new Configuration();
            URI uri = new URI("hdfs://192.168.226.129:9000");
            FileSystem fs = FileSystem.get(uri, conf);
            Path dfs = new Path("hdfs://192.168.226.129:9000/rootdir/ssh.txt");
            FileStatus fileStatus = fs.getFileStatus(dfs);
            // One BlockLocation per block of the file
            BlockLocation[] blkLocations =
                    fs.getFileBlockLocations(fileStatus, 0, fileStatus.getLen());
            int blockLen = blkLocations.length;
            System.out.println("Number of blocks: " + blockLen);
            for (int i = 0; i < blockLen; i++) {
                // Each block may be replicated on several hosts, so iterate over
                // all of them (the original indexed hosts by block number, which
                // is a bug when a block has fewer replicas than its index)
                for (String host : blkLocations[i].getHosts()) {
                    System.out.println("Block " + i + " Location: " + host);
                }
            }
        } catch (IllegalArgumentException | URISyntaxException | IOException e) {
            e.printStackTrace();
        }
    }
}
```
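Because HDFS splits a file into fixed-size blocks (128 MB by default in Hadoop 2.x and later), the block that contains a given byte offset can be computed with simple integer division. This is a minimal sketch of that arithmetic; the class and method names are invented for illustration:

```java
public class BlockIndex {
    // For a file split into fixed-size blocks, the block holding a given
    // byte offset is simply offset / blockSize (integer division).
    public static int blockIndexOf(long offset, long blockSize) {
        if (offset < 0 || blockSize <= 0) {
            throw new IllegalArgumentException("offset must be >= 0 and blockSize > 0");
        }
        return (int) (offset / blockSize);
    }

    public static void main(String[] args) {
        long blockSize = 128L * 1024 * 1024; // 128 MB, the HDFS default
        // A byte at offset 0 is in block 0; a byte at 200 MB is in block 1
        System.out.println(blockIndexOf(0L, blockSize));
        System.out.println(blockIndexOf(200L * 1024 * 1024, blockSize));
    }
}
```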
3: Get the names of all nodes in the HDFS cluster
```java
import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.hdfs.DistributedFileSystem;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;

public class Test8GetList {
    /**
     * List the host names of all DataNodes in the HDFS cluster.
     */
    public static void main(String[] args) {
        try {
            Configuration conf = new Configuration();
            URI uri = new URI("hdfs://192.168.226.129:9000");
            FileSystem fs = FileSystem.get(uri, conf);
            // getDataNodeStats() is specific to DistributedFileSystem,
            // so the generic FileSystem handle must be cast
            DistributedFileSystem hdfs = (DistributedFileSystem) fs;
            DatanodeInfo[] dataNodeStats = hdfs.getDataNodeStats();
            for (int i = 0; i < dataNodeStats.length; i++) {
                System.out.println("Node " + i + " Name: " + dataNodeStats[i].getHostName());
            }
        } catch (URISyntaxException | IOException e) {
            e.printStackTrace();
        }
    }
}
```
4: Upload a video file to HDFS (not a real-time video stream)
```java
import java.io.File;
import java.io.IOException;
import java.net.URI;
import java.net.URISyntaxException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class UploadLive {
    public static void main(String[] args) {
        try {
            Configuration conf = new Configuration();
            URI uri = new URI("hdfs://192.168.226.129:9000");
            FileSystem fs = FileSystem.get(uri, conf);
            FileSystem local = FileSystem.getLocal(conf);
            // Local source directory and HDFS target directory
            Path inputDir = new Path("F:\\AHadoopTestFile");
            Path hdfsFile = new Path("hdfs://192.168.226.129:9000/testhadoop/acceptLiveFile");
            System.out.println(inputDir.toString());
            // Create the "acceptLiveFile" directory on HDFS to receive the
            // video files, if it does not already exist
            if (!fs.exists(hdfsFile)) {
                fs.mkdirs(hdfsFile);
                System.out.println("Created target directory...");
            }
            FileStatus[] inputFiles = local.listStatus(inputDir);
            // Write each local file into the HDFS target directory
            // through a 1 KB buffer
            for (int i = 0; i < inputFiles.length; i++) {
                System.out.println(inputFiles[i].getPath().getName());
                FSDataInputStream in = local.open(inputFiles[i].getPath());
                FSDataOutputStream out = fs.create(new Path(
                        "hdfs://192.168.226.129:9000/testhadoop/acceptLiveFile/"
                                + inputFiles[i].getPath().getName()));
                byte[] buffer = new byte[1024];
                int bytesRead;
                while ((bytesRead = in.read(buffer)) > 0) {
                    out.write(buffer, 0, bytesRead);
                }
                out.close();
                in.close();
                // Delete the local file after the upload completes; use the
                // URI path so java.io.File gets a plain filesystem path
                new File(inputFiles[i].getPath().toUri().getPath()).delete();
            }
        } catch (IllegalArgumentException | URISyntaxException | IOException e) {
            e.printStackTrace();
        }
    }
}
```
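The core of the upload above is the buffered read/write loop. The same loop can be exercised without a cluster by running it over in-memory streams; this sketch (the class name is invented for illustration) shows that the copy preserves the bytes exactly:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;

public class BufferedCopy {
    // The same read/write loop used in the HDFS upload: read up to 1 KB at a
    // time and write exactly the number of bytes read, until EOF.
    public static void copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[1024];
        int bytesRead;
        while ((bytesRead = in.read(buffer)) > 0) {
            out.write(buffer, 0, bytesRead);
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "hello hdfs".getBytes(StandardCharsets.UTF_8);
        ByteArrayOutputStream sink = new ByteArrayOutputStream();
        copy(new ByteArrayInputStream(data), sink);
        System.out.println(sink.toString("UTF-8")); // prints hello hdfs
    }
}
```

In production code, Hadoop's own `IOUtils.copyBytes` helper is typically used for this copy instead of a hand-written loop.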
That covers these basic, commonly used HDFS operations. Thanks for reading, and I hope the examples are helpful.