Insert data using HBase shell put Command and Examples

The set of basic HBase operations is referred to as CRUD operations, i.e. create, read, update, and delete. The HBase create operation is nothing but the put command: put is used to insert data into HBase tables. In this article, we will check how to insert data using the HBase shell put command.
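For context, here is how the four CRUD operations map to HBase shell commands (shown in generic form; the table, row, and column names are placeholders):

put 'table', 'row', 'cf:col', 'value'    # create, or update an existing cell
get 'table', 'row'                       # read a single row
scan 'table'                             # read many rows
delete 'table', 'row', 'cf:col'          # delete a cell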

Insert data using HBase shell put command

In this article, we concentrate only on the HBase shell commands.

Consider the table below, which we are going to create in HBase to demonstrate inserting data with the put command.

Row   personal_data:name   personal_data:city   personal_data:age
1     Ram                  Bengaluru            25

In the above table, personal_data is the column family name, and name, city, and age are the column names.
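Before any put, the table and its column family must already exist. Below is a minimal sketch of creating it from the HBase shell, using the table name 'personal' that the examples in this article refer to:

create 'personal', 'personal_data'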

HBase put command Syntax

Below is the syntax for creating, i.e. inserting, data into HBase tables:

put '<HBase_table_name>', 'row_key', '<colfamily:colname>', '<value>'

HBase put command Example

Below is an example of inserting a row using the HBase put command:

hbase(main):012:0> put 'personal',1,'personal_data:name','Ram'
0 row(s) in 0.0070 seconds
hbase(main):013:0> put 'personal',1,'personal_data:city','Bengaluru'
0 row(s) in 0.0070 seconds
hbase(main):014:0> put 'personal',1,'personal_data:age','25'
0 row(s) in 0.0070 seconds

hbase(main):015:0> scan 'personal'
ROW COLUMN+CELL
 1 column=personal_data:age, timestamp=1505285659934, value=25
 1 column=personal_data:city, timestamp=1505285653043, value=Bengaluru
 1 column=personal_data:name, timestamp=1505285635428, value=Ram
1 row(s) in 0.0130 seconds

hbase(main):016:0>
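The scan above reads the whole table. To read back only the row we just inserted, the get command can be used; a minimal sketch (output not shown):

get 'personal', '1'

This should return the three cells stored under row key 1 in the personal_data column family.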

Load CSV file to HBase Table

Below is an example that lets you load data from an HDFS file into an HBase table using the ImportTsv utility. You must first copy the local CSV file into an HDFS folder, as sketched below, and then run ImportTsv against it.
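A minimal sketch of staging the file, assuming a local file named personal.csv (the name is illustrative) that contains the rows appearing in the scan output further below, and /test as the HDFS input folder used by the ImportTsv command:

$ cat personal.csv
2,sham,Bengaluru,24
3,Guru,New Delhi,27
4,John,NY,26
5,Rock,DC,30

$ hdfs dfs -mkdir -p /test
$ hdfs dfs -put personal.csv /test

With the file staged in HDFS, the ImportTsv command below loads it into the personal table: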

$ hbase org.apache.hadoop.hbase.mapreduce.ImportTsv -Dimporttsv.separator=',' -Dimporttsv.columns=HBASE_ROW_KEY,personal_data:name,personal_data:city,personal_data:age personal /test

The above command will launch a MapReduce job to load the data from the CSV file into the HBase table. Below is the output of scanning the table after inserting data from the CSV file:

hbase(main):002:0> scan 'personal'
ROW COLUMN+CELL
 1 column=personal_data:age, timestamp=1505285659934, value=25
 1 column=personal_data:city, timestamp=1505285653043, value=Bengaluru
 1 column=personal_data:name, timestamp=1505285635428, value=Ram
 2 column=personal_data:age, timestamp=1505286495492, value=24
 2 column=personal_data:city, timestamp=1505286495492, value=Bengaluru
 2 column=personal_data:name, timestamp=1505286495492, value=sham
 3 column=personal_data:age, timestamp=1505286495492, value=27
 3 column=personal_data:city, timestamp=1505286495492, value=New Delhi
 3 column=personal_data:name, timestamp=1505286495492, value=Guru
 4 column=personal_data:age, timestamp=1505286495492, value=26
 4 column=personal_data:city, timestamp=1505286495492, value=NY
 4 column=personal_data:name, timestamp=1505286495492, value=John
 5 column=personal_data:age, timestamp=1505286495492, value=30
 5 column=personal_data:city, timestamp=1505286495492, value=DC
 5 column=personal_data:name, timestamp=1505286495492, value=Rock
5 row(s) in 0.0510 seconds
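To quickly verify how many rows the import produced, the HBase shell count command can be used; a minimal sketch:

count 'personal'

For the data above it should report 5 row(s).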

This Post Has 2 Comments

  1. Abhi

    How do I insert a double-quoted value into the HBase table using the put command?
    For example: in the above example, I want to insert “Bengaluru” instead of Bengaluru.

    1. Vithal S

      Hi,

      Currently, there is no option to load quoted values into HBase directly. As a workaround, you can create a Hive external table on top of the HBase table and load the quoted values through it.

      Thanks,
