
How to Import and Export Hive Data


This article explains how to import and export Hive data. The methods described are simple to use and practical; the sections below walk through each of them with a live Hive session.

1. Importing from the local file system

The source data file lives at /root/data on the local file system.

hive> load data local inpath "/root/data" overwrite into table t1;
Loading data to table default.t1
Table default.t1 stats: [numFiles=1, numRows=0, totalSize=30, rawDataSize=0]
OK
Time taken: 1.712 seconds
hive> select * from t1;
OK
zhangsan        25
lisi    27
wangwu  24
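The session above assumes that t1 was created beforehand as a two-column table (name and age) whose field delimiter matches the file in /root/data. The original post never shows that definition; a minimal sketch of what it might look like:

hive> create table t1 (name string, age int)
    > row format delimited
    > fields terminated by '\t'
    > stored as textfile;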

2. Importing from HDFS

The data file is already stored in HDFS:

[root@crxy177 ~]# hadoop dfs -ls /

-rw-r--r--   1 root supergroup         30 2015-05-18 10:39 /data
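Unlike the LOCAL variant, LOAD DATA INPATH ... OVERWRITE moves the HDFS source file into the table's warehouse directory rather than copying it, and sends the table's previous file to the trash, as the Moved: line in the output below shows. A quick way to inspect both locations from the Hive CLI (a sketch; the warehouse path is taken from the session output and may differ on other setups):

hive> dfs -ls /data;
hive> dfs -ls /user/hive/warehouse/t1;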

hive> load data inpath "/data" overwrite into table t1;
Loading data to table default.t1
Moved: 'hdfs://192.168.1.177:9000/user/hive/warehouse/t1/data' to trash at: hdfs://192.168.1.177:9000/user/root/.Trash/Current
Table default.t1 stats: [numFiles=1, numRows=0, totalSize=30, rawDataSize=0]
OK
Time taken: 1.551 seconds

3. Importing with a query

Create a new table with the same schema as t1:

hive> create table t2 like t1;

OK

Time taken: 0.246 seconds
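CREATE TABLE ... LIKE copies only the schema and storage properties of t1, not its data, so t2 starts out empty. Describing it should show the same columns (a hedged check, assuming the two-column definition sketched in section 1):

hive> describe t2;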

Import the data with INSERT ... SELECT. Note that the first attempt below mistypes "from" as "form"; Hive rejects it with a NullPointerException, and the statement is then re-entered correctly.

hive> insert overwrite table t2 select * form t1;
FAILED: NullPointerException null
hive> insert overwrite table t2 select * from t1;
Query ID = root_20150518104747_7922f9d4-2e15-434a-8b9f-076393d73470
Total jobs = 3
Launching Job 1 out of 3
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1431916152610_0001, Tracking URL = http://crxy177:8088/proxy/application_1431916152610_0001/
Kill Command = /usr/local/hadoop-2.6.0/bin/hadoop job  -kill job_1431916152610_0001
Interrupting... Be patient, this might take some time.
Press Ctrl+C again to kill JVM
killing job with: job_1431916152610_0001
Hadoop job information for Stage-1: number of mappers: 0; number of reducers: 0
2015-05-18 10:47:40,679 Stage-1 map = 0%,  reduce = 0%
Ended Job = job_1431916152610_0001 with errors
Error during job, obtaining debugging information...
FAILED: Execution Error, return code 2 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask
MapReduce Jobs Launched:
Stage-Stage-1:  HDFS Read: 0 HDFS Write: 0 FAIL
Total MapReduce CPU Time Spent: 0 msec
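In this session the MapReduce job was interrupted with Ctrl+C, so Hive reports an execution error and t2 is left unpopulated here; letting the same INSERT OVERWRITE statement run to completion (as the multi-table insert in the next section does) fills the table. A quick check after a successful run (not part of the original session):

hive> select * from t2;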

4. Inserting into multiple tables at once

Create tables t3 and t4:

hive> create table t3 like t1;
OK
Time taken: 1.235 seconds
hive> create table t4 like t1;
OK
Time taken: 0.211 seconds

Load data into all three tables with a single multi-table insert; the FROM clause comes first, so t1 is scanned only once:

hive> FROM t1
    > INSERT OVERWRITE TABLE t2 SELECT * WHERE 1=1
    > INSERT OVERWRITE TABLE t3 SELECT * WHERE 1=1
    > INSERT OVERWRITE TABLE t4 SELECT * WHERE 1=1;

Query ID = root_20150518105252_9101659d-0990-4626-a4f7-8bad768af48b
Total jobs = 7
Launching Job 1 out of 7
Number of reduce tasks is set to 0 since there's no reduce operator
Starting Job = job_1431916152610_0002, Tracking URL = http://crxy177:8088/proxy/application_1431916152610_0002/
Kill Command = /usr/local/hadoop-2.6.0/bin/hadoop job -kill job_1431916152610_0002
Hadoop job information for Stage-3: number of mappers: 1; number of reducers: 0
2015-05-18 10:52:50,866 Stage-3 map = 0%,  reduce = 0%
2015-05-18 10:53:02,273 Stage-3 map = 100%,  reduce = 0%, Cumulative CPU 1.41 sec
MapReduce Total cumulative CPU time: 1 seconds 410 msec
Ended Job = job_1431916152610_0002
Stage-6 is selected by condition resolver.
Stage-5 is filtered out by condition resolver.
Stage-7 is filtered out by condition resolver.
Stage-12 is selected by condition resolver.
Stage-11 is filtered out by condition resolver.
Stage-13 is filtered out by condition resolver.
Stage-18 is selected by condition resolver.
Stage-17 is filtered out by condition resolver.
Stage-19 is filtered out by condition resolver.
Moving data to: hdfs://192.168.1.177:9000/tmp/hive/root/88e075ab-e7da-497d-a56b-74f652f3eae6/hive_2015-05-18_10-52-30_865_4936011539493382740-1/-ext-10000
Moving data to: hdfs://192.168.1.177:9000/tmp/hive/root/88e075ab-e7da-497d-a56b-74f652f3eae6/hive_2015-05-18_10-52-30_865_4936011539493382740-1/-ext-10002
Moving data to: hdfs://192.168.1.177:9000/tmp/hive/root/88e075ab-e7da-497d-a56b-74f652f3eae6/hive_2015-05-18_10-52-30_865_4936011539493382740-1/-ext-10004
Loading data to table default.t2
Loading data to table default.t3
Loading data to table default.t4
Table default.t2 stats: [numFiles=1, numRows=0, totalSize=30, rawDataSize=0]
Table default.t3 stats: [numFiles=1, numRows=0, totalSize=30, rawDataSize=0]
Table default.t4 stats: [numFiles=1, numRows=0, totalSize=30, rawDataSize=0]
MapReduce Jobs Launched:
Stage-Stage-3: Map: 1   Cumulative CPU: 1.41 sec   HDFS Read: 237 HDFS Write: 288 SUCCESS
Total MapReduce CPU Time Spent: 1 seconds 410 msec
OK
Time taken: 34.245 seconds
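With the job reporting SUCCESS, t2, t3, and t4 should each hold the same three rows as t1. A quick verification (these queries are not part of the original session):

hive> select * from t2;
hive> select * from t3;
hive> select * from t4;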

This should give you a better understanding of how to import and export Hive data; try the steps on your own cluster to put them into practice.
