There are 3 Ways to Load Data From HDFS to HBase
1. Using ImportTsv to load a txt file into HBase
a) Create a table in HBase
command:
create 'tab3','cf'
b) Upload simple1.txt to HDFS
bin/hadoop fs -copyFromLocal simple1.txt /user/hadoop/simple1.txt
The content of the txt file is:
1,tom
2,sam
3,jerry
4,marry
5,john
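With -Dimporttsv.separator="," and -Dimporttsv.columns=HBASE_ROW_KEY,cf, ImportTsv treats the first field of each line as the row key and the second field as the value stored under column family cf. The local sketch below is an illustration of that field mapping only, not how ImportTsv is actually implemented:

```shell
# Illustration only: mimic the field mapping ImportTsv applies with
# -Dimporttsv.separator="," and -Dimporttsv.columns=HBASE_ROW_KEY,cf:
# first field -> row key, second field -> value under family cf.
printf '1,tom\n2,sam\n3,jerry\n4,marry\n5,john\n' > simple1.txt
awk -F',' '{ printf "%s -> cf:%s\n", $1, $2 }' simple1.txt
```

The first line of output is `1 -> cf:tom`, i.e. row key 1 with value tom in family cf.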
c) Using ImportTsv to load the txt file into HBase
bin/hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.separator=","
-Dimporttsv.columns=HBASE_ROW_KEY,cf tab3 /user/hadoop/simple1.txt
ImportTsv runs as a MapReduce job; once it completes, the rows are written directly into the table.
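To confirm the rows landed, one option is to scan the table from the HBase shell (this assumes a running HBase cluster and the tab3 table created in step a):

```shell
# Sanity check (requires a running HBase cluster): scan the target table
# created in step a) and confirm the five rows are present.
echo "scan 'tab3'" | bin/hbase shell
```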
2. Using ImportTsv to generate HFiles from the txt file
a) Create a table in HBase
create 'hbase-tbl-003','cf'
b) Using ImportTsv to generate HFiles for the txt file in HDFS
bin/hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.separator=","
-Dimporttsv.bulk.output=hfile_tmp5 -Dimporttsv.columns=HBASE_ROW_KEY,cf hbase-tbl-003 /user/hadoop/simple1.txt
This command runs as a MapReduce job.
As a result, the HFiles are generated under the hfile_tmp5 directory.
But the data has not yet been loaded into the HBase table hbase-tbl-003; that still requires the completebulkload step.
3. Using completebulkload to load the HFiles into HBase
hadoop jar lib/hbase-server-0.98.13-hadoop2.jar completebulkload hfile_tmp5 hbase-tbl-003
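completebulkload moves the generated HFiles into the table's region directories rather than copying them, so after it finishes the output directory no longer contains the data files, and a scan shows the loaded rows (both commands assume a running cluster):

```shell
# After completebulkload (requires a running Hadoop/HBase cluster):
# the HFiles have been moved out of hfile_tmp5 into the table's regions.
bin/hadoop fs -ls -R hfile_tmp5
echo "scan 'hbase-tbl-003'" | bin/hbase shell
```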
Result: the rows from simple1.txt are now present in hbase-tbl-003.