Exporting Data from Hive (including exporting Hive data to MySQL)


Using Hadoop commands: get and text

hive> dfs -get /user/hive/warehouse/testtable/* /liguodong/dataimport;
hive> !ls /liguodong/dataimport/;
datatest
datatest_copy_1
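The difference between the two commands: `dfs -get` copies the warehouse files byte-for-byte, while `dfs -text` also decompresses common codecs (gzip, SequenceFile, etc.) and prints the records to stdout. A minimal local sketch of that difference, simulated with gzip since no cluster is assumed here (the `/tmp/hive_export_demo` path and the sample row are made up for illustration):

```shell
# Simulate a compressed warehouse file (sample data, not from a real table)
mkdir -p /tmp/hive_export_demo
printf 'liguodong\tshanghai\n' | gzip > /tmp/hive_export_demo/part-00000.gz

# "-get" equivalent: a plain copy; the result is still compressed bytes
cp /tmp/hive_export_demo/part-00000.gz /tmp/hive_export_demo/copied.gz

# "-text" equivalent: decompress and show the readable records
gzip -dc /tmp/hive_export_demo/copied.gz
```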

Using INSERT … DIRECTORY

insert overwrite [local] directory '/tmp/ca_employees'
[row format delimited fields terminated by '\t']
select name, salary, address from employee;

Export to a local directory:
insert overwrite local directory '/liguodong/hiveimport'
select name, addr
from testtable;

With an explicit field delimiter:
insert overwrite local directory '/liguodong/hiveimport'
row format delimited fields terminated by '\t' 
select name,addr
from  testtable;
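With `fields terminated by '\t'`, the export directory contains plain files (typically named `000000_0`, `000001_0`, …) with one row per line and tab-separated fields, which standard tools read directly. A small sketch, using printf to stand in for a real export file (the `/tmp/000000_0` path and the two sample rows are assumptions for illustration):

```shell
# Simulate a tab-delimited export file such as /liguodong/hiveimport/000000_0
printf 'liguodong\tshanghai\nzhangsan\tbeijing\n' > /tmp/000000_0

# Fields split cleanly on the tab; print just the first column
awk -F'\t' '{print $1}' /tmp/000000_0
```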

Export to an HDFS directory (this Hive version does not allow row format here, so the output uses Hive's default '\001' field delimiter):
insert overwrite directory '/liguodong/hiveimport'
select name,addr
from  testtable;

Using a shell pipeline

hive -f/-e | sed/grep/awk > file

hive -S -e "select * from testtable" | grep liguodong
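Here `-S` (silent mode) suppresses Hive's progress and log output, so stdout carries only the result rows and any text filter can post-process them. A sketch of the pattern with printf standing in for the hive command, since no cluster is assumed (the sample rows are made up):

```shell
# printf simulates what `hive -S -e "select * from testtable"` would emit:
# result rows only, one per line, no log noise
printf 'liguodong\tshanghai\nzhangsan\tbeijing\n' \
  | grep liguodong > /tmp/filtered.txt

# Only the matching row survives the pipeline
cat /tmp/filtered.txt
```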

Using a third-party tool: Sqoop

Export the data to MySQL:

sqoop export --connect jdbc:mysql://hadoop1:3306/qt --username root --password liguodong \
--table resultshow --fields-terminated-by '\001' \
--export-dir '/user/hive/warehouse/addressall_2015_07_09'
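The `--fields-terminated-by '\001'` option matters because Hive's default field delimiter is Ctrl-A (octal 001); without it, Sqoop cannot split the warehouse files into columns. A local sketch of what such a row looks like and how it splits (the `/tmp/warehouse_row` path and the two sample column values are assumptions, not real table data):

```shell
# Simulate one row of a Hive warehouse file: two fields joined by \001 (Ctrl-A)
printf 'beijing\00112\n' > /tmp/warehouse_row

# Splitting on \001 recovers the columns that sqoop would load into MySQL
awk -F'\001' '{print $1, $2}' /tmp/warehouse_row
```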
