Hadoop Series - HBase Database: The Java API

Author: 汤太咸啊 | Published 2022-04-02 16:34

1. Integrating HBase with Java

So far all of the HBase operations have been run through the hbase shell; this time we bring them into a Java project.

Let's look at how to do that.

Note that when the Docker image was first started, the fixed ports HBase needs had to be exposed: ZooKeeper's 2181 (HBase ships with its own ZooKeeper, so no separate installation is needed) and HBase's 16000 and 16201. Below is the port mapping to the host used when Hadoop was originally deployed:

docker run -p 2888:2888 -p 3888:3888 -p 2181:2181 -p 9000:9000 -p 50070:50070 -p 8485:8485 -p 9870:9870 -p 8042:8042 -p 50010:50010 -p 16000:16000 -p 16201:16201 -p 8088:8088 -p 60000:60000 -p 60011:60010 -p 60020:60020 -p 60030:60030 -p 8080:8080 -p 9090:9090  --name master -d -h master sequenceiq/hadoop-docker
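Depending on your environment, exposing the ports may not be enough by itself: the HBase client discovers the region server through ZooKeeper by hostname, so the container hostname set with -h master may also need to resolve from the machine running the Java code. On a typical single-host setup this is just an /etc/hosts entry such as 127.0.0.1 master (this entry is an assumption about the local setup, not part of the original deployment).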

2. Adding the POM Dependency

Since the HBase installation is version 1.7.1, the client dependency pulled in through the POM must be 1.7.1 as well:

<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>1.7.1</version>
</dependency>

3. Creating a Table

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class CreateTable {
    public static void main(String[] args) throws Exception{
        Configuration configuration = HBaseConfiguration.create();
        configuration.set("hbase.zookeeper.property.clientPort", "2181");
        configuration.set("hbase.zookeeper.quorum", "127.0.0.1");
        Connection connection = ConnectionFactory.createConnection(configuration);
        Admin admin = null;
        try{
            admin = connection.getAdmin();
            HTableDescriptor tableDescriptor = new HTableDescriptor(TableName.valueOf("census"));
            tableDescriptor.addFamily(new HColumnDescriptor("personal"));
            tableDescriptor.addFamily(new HColumnDescriptor("professional"));
            if(!admin.tableExists(tableDescriptor.getTableName())){
                System.out.println("Creating table...");
                admin.createTable(tableDescriptor);
                System.out.println("Table created");
            }else{
                System.out.println("Table already exists");
            }
        }finally {
            if(admin!=null) {
                admin.close();
            }
            connection.close();
        }
    }
}
//Output
Creating table...
Table created

After the Java program finishes, check in the hbase shell that the table has been created:

hbase(main):003:0> list
TABLE                                                                                                                                                                                                                                                                         
census                                                                                                                                                                                                                                                                        
1 row(s) in 0.1870 seconds
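The same check can also be done from Java rather than the shell. A minimal sketch, assuming the same connection settings as above (the class name VerifyTable is only for illustration):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class VerifyTable {
    public static void main(String[] args) throws Exception {
        Configuration configuration = HBaseConfiguration.create();
        configuration.set("hbase.zookeeper.property.clientPort", "2181");
        configuration.set("hbase.zookeeper.quorum", "127.0.0.1");
        try (Connection connection = ConnectionFactory.createConnection(configuration);
             Admin admin = connection.getAdmin()) {
            // listTableNames() returns every table visible to this client
            for (TableName name : admin.listTableNames()) {
                System.out.println(name.getNameAsString());
            }
        }
    }
}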

4. Putting Data

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

import java.util.ArrayList;
import java.util.List;

public class SimplePut {
    private static final byte[] PERSONAL_CF = Bytes.toBytes("personal");
    private static final byte[] PROFESSIONAL_CF = Bytes.toBytes("professional");
    private static final byte[] NAME_COLUMN = Bytes.toBytes("name");
    private static final byte[] GENDER_COLUMN = Bytes.toBytes("gender");
    private static final byte[] MARITAL_STATUS_COLUMN = Bytes.toBytes("marital_status");
    private static final byte[] EMPLOYED_COLUMN = Bytes.toBytes("employed");
    private static final byte[] FIELD_COLUMN = Bytes.toBytes("field");
    public static void main(String[] args) throws Exception {
        Configuration configuration = HBaseConfiguration.create();
        configuration.set("hbase.zookeeper.property.clientPort", "2181");
        configuration.set("hbase.zookeeper.quorum", "127.0.0.1");
        Connection connection = ConnectionFactory.createConnection(configuration);
        Table table = null;
        try{
            table = connection.getTable(TableName.valueOf("census"));
            Put put1 = new Put(Bytes.toBytes("1"));
            put1.addColumn(PERSONAL_CF,NAME_COLUMN,Bytes.toBytes("章三"));
            put1.addColumn(PERSONAL_CF,GENDER_COLUMN,Bytes.toBytes("男"));
            put1.addColumn(PERSONAL_CF,MARITAL_STATUS_COLUMN,Bytes.toBytes("已婚"));
            put1.addColumn(PROFESSIONAL_CF,EMPLOYED_COLUMN,Bytes.toBytes("是"));
            put1.addColumn(PROFESSIONAL_CF,FIELD_COLUMN,Bytes.toBytes("constraction"));
            table.put(put1);
            System.out.println("章三 插入hbase完成");
            Put put2 = new Put(Bytes.toBytes("2"));
            put2.addColumn(PERSONAL_CF,NAME_COLUMN,Bytes.toBytes("里斯"));
            put2.addColumn(PERSONAL_CF,GENDER_COLUMN,Bytes.toBytes("女"));
            put2.addColumn(PERSONAL_CF,MARITAL_STATUS_COLUMN,Bytes.toBytes("已婚"));
            put2.addColumn(PROFESSIONAL_CF,EMPLOYED_COLUMN,Bytes.toBytes("是"));
            put2.addColumn(PROFESSIONAL_CF,FIELD_COLUMN,Bytes.toBytes("constraction1"));
            Put put3 = new Put(Bytes.toBytes("3"));
            put3.addColumn(PERSONAL_CF,NAME_COLUMN,Bytes.toBytes("王3"));
            put3.addColumn(PERSONAL_CF,GENDER_COLUMN,Bytes.toBytes("女"));
            List<Put> list = new ArrayList<>();
            list.add(put2);
            list.add(put3);
            table.put(list);
            System.out.println("批量 插入hbase完成");
        }finally {
            connection.close();
            if(table!=null) {
                table.close();
            }
        }
    }
}
//Output
章三 inserted into HBase
Batch insert into HBase complete

Checking in the hbase shell: note that everything in the table is stored as bytes, so the Chinese values appear as escaped UTF-8 rather than readable text. Reading them back through Java works fine (see the small decoding sketch after the scan output below).

hbase(main):001:0> scan 'census'
ROW    COLUMN+CELL                                   
 1     column=personal:gender, timestamp=1648107751834, value=\xE7\x94\xB7          
 1     column=personal:marital_status, timestamp=1648107751834, value=\xE5\xB7\xB2\xE5\xA9\x9A                     
 1     column=personal:name, timestamp=1648107751834, value=\xE7\xAB\xA0\xE4\xB8\x89                               
 1     column=professional:employed, timestamp=1648107751834, value=\xE6\x98\xAF    
 1     column=professional:field, timestamp=1648107751834, value=constraction       
 2     column=personal:gender, timestamp=1648107751855, value=\xE5\xA5\xB3          
 2     column=personal:marital_status, timestamp=1648107751855, value=\xE5\xB7\xB2\xE5\xA9\x9A                     
 2     column=personal:name, timestamp=1648107751855, value=\xE9\x87\x8C\xE6\x96\xAF                               
 2     column=professional:employed, timestamp=1648107751855, value=\xE6\x98\xAF    
 2     column=professional:field, timestamp=1648107751855, value=constraction1      
 3     column=personal:gender, timestamp=1648107751855, value=\xE5\xA5\xB3          
 3     column=personal:name, timestamp=1648107751855, value=\xE7\x8E\x8B3 
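The escaped values above are just the UTF-8 bytes of the stored Chinese strings; they decode straight back with the same Bytes utility used for the puts. A minimal sketch (the byte values are copied from the personal:name cell of row 1):

import org.apache.hadoop.hbase.util.Bytes;

public class DecodeDemo {
    public static void main(String[] args) {
        // \xE7\xAB\xA0\xE4\xB8\x89 in the shell output is the UTF-8 encoding of 章三
        byte[] raw = {(byte) 0xE7, (byte) 0xAB, (byte) 0xA0, (byte) 0xE4, (byte) 0xB8, (byte) 0x89};
        System.out.println(Bytes.toString(raw)); // prints 章三
    }
}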

5. Getting Data

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

import java.util.ArrayList;
import java.util.List;

public class SimpleGet {
    private static final byte[] PERSONAL_CF = Bytes.toBytes("personal");
    private static final byte[] PROFESSIONAL_CF = Bytes.toBytes("professional");
    private static final byte[] NAME_COLUMN = Bytes.toBytes("name");
    private static final byte[] FIELD_COLUMN = Bytes.toBytes("field");
    public static void main(String[] args) throws Exception {
        Configuration configuration = HBaseConfiguration.create();
        configuration.set("hbase.zookeeper.property.clientPort", "2181");
        configuration.set("hbase.zookeeper.quorum", "127.0.0.1");
        Connection connection = ConnectionFactory.createConnection(configuration);
        Table table = null;
        try{
            table = connection.getTable(TableName.valueOf("census"));
            Get get = new Get(Bytes.toBytes("1"));
            get.addColumn(PERSONAL_CF,NAME_COLUMN);
            get.addColumn(PROFESSIONAL_CF,FIELD_COLUMN);
            Result result = table.get(get);
            byte[] nameValue = result.getValue(PERSONAL_CF,NAME_COLUMN);
            byte[] fieldValue = result.getValue(PROFESSIONAL_CF,FIELD_COLUMN);
            System.out.println("Get result name: "+ Bytes.toString(nameValue) +"; field: "+ Bytes.toString(fieldValue));
            Get get1 = new Get(Bytes.toBytes("2"));
            get1.addColumn(PERSONAL_CF,NAME_COLUMN);
            Get get2 = new Get(Bytes.toBytes("3"));
            get2.addColumn(PERSONAL_CF,NAME_COLUMN);
            List<Get> gets = new ArrayList<>();
            gets.add(get1);
            gets.add(get2);
            Result[] results = table.get(gets);
            for (Result r:results) {
                byte[] nameValue1 = r.getValue(PERSONAL_CF,NAME_COLUMN);
                System.out.println("Batch Get result name: "+ Bytes.toString(nameValue1));
            }
        }finally {
            if(table!=null) {
                table.close();
            }
            connection.close();
        }
    }
}
//Output
Get result name: 章三; field: constraction
Batch Get result name: 里斯
Batch Get result name: 王3
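If a row key or column does not exist, table.get() does not throw: the returned Result is simply empty, and getValue() returns null for a missing column. A small defensive sketch that would drop into the same try block as SimpleGet above (the row key "99" is just a hypothetical missing row):

            Get missing = new Get(Bytes.toBytes("99"));
            Result r = table.get(missing);
            if (r.isEmpty()) {
                System.out.println("Row 99 not found");
            } else {
                byte[] name = r.getValue(PERSONAL_CF, NAME_COLUMN);
                System.out.println(name == null ? "no name column" : Bytes.toString(name));
            }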

6. Scanning Data

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.CellUtil;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

public class SimpleScan {
    public static void main(String[] args) throws Exception {
        Configuration configuration = HBaseConfiguration.create();
        configuration.set("hbase.zookeeper.property.clientPort", "2181");
        configuration.set("hbase.zookeeper.quorum", "127.0.0.1");
        Connection connection = ConnectionFactory.createConnection(configuration);
        Table table = null;
        ResultScanner scanResult = null;
        try{
            table = connection.getTable(TableName.valueOf("census"));
            Scan scan = new Scan();
            scanResult = table.getScanner(scan);
            for (Result result:scanResult) {
                for (Cell cell:result.listCells()) {
                    String row = Bytes.toString(CellUtil.cloneRow(cell));
                    String family = Bytes.toString(CellUtil.cloneFamily(cell));
                    String column = Bytes.toString(CellUtil.cloneQualifier(cell));
                    String value = Bytes.toString(CellUtil.cloneValue(cell));
                    System.out.println("row: "+ row + " family: "+family+ " column: "+column+ " value: "+value);
                }
            }
        }finally {
            if(scanResult!=null) {
                scanResult.close();
            }
            if(table!=null) {
                table.close();
            }
            connection.close();
        }
    }
}
//Output
row: 1 family: personal column: gender value: 男
row: 1 family: personal column: marital_status value: 已婚
row: 1 family: personal column: name value: 章三
row: 1 family: professional column: employed value: 是
row: 1 family: professional column: field value: constraction
row: 2 family: personal column: gender value: 女
row: 2 family: personal column: marital_status value: 已婚
row: 2 family: personal column: name value: 里斯
row: 2 family: professional column: employed value: 是
row: 2 family: professional column: field value: constraction1
row: 3 family: personal column: gender value: 女
row: 3 family: personal column: name value: 王3
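By default a Scan walks the entire table. The 1.x client API also lets you bound the scan with start and stop row keys (the stop row is exclusive); a minimal sketch, reusing the table handle from SimpleScan above:

            Scan bounded = new Scan();
            bounded.setStartRow(Bytes.toBytes("1")); // inclusive
            bounded.setStopRow(Bytes.toBytes("3"));  // exclusive, so rows 1 and 2 are returned
            ResultScanner rs = table.getScanner(bounded);
            try {
                for (Result result : rs) {
                    System.out.println("row: " + Bytes.toString(result.getRow()));
                }
            } finally {
                rs.close();
            }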

7. Deleting Data

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

public class SimpleDelete {
    private static final byte[] PERSONAL_CF = Bytes.toBytes("personal");
    private static final byte[] PROFESSIONAL_CF = Bytes.toBytes("professional");
    private static final byte[] MARITAL_STATUS_COLUMN = Bytes.toBytes("marital_status");
    private static final byte[] FIELD_COLUMN = Bytes.toBytes("field");
    public static void main(String[] args) throws Exception {
        Configuration configuration = HBaseConfiguration.create();
        configuration.set("hbase.zookeeper.property.clientPort", "2181");
        configuration.set("hbase.zookeeper.quorum", "127.0.0.1");
        Connection connection = ConnectionFactory.createConnection(configuration);
        Table table = null;
        try{
            table = connection.getTable(TableName.valueOf("census"));
            Delete delete = new Delete(Bytes.toBytes("1"));
            delete.addColumn(PERSONAL_CF,MARITAL_STATUS_COLUMN);
            delete.addColumn(PROFESSIONAL_CF,FIELD_COLUMN);
            table.delete(delete);
            System.out.println("delete 完成");
        }finally {
            if(table!=null) {
                table.close();
            }
            connection.close();
        }
    }
}
//Output
Delete complete

Checking in the hbase shell, the marital_status and field columns of row 1 are gone:

hbase(main):005:0> scan 'census'
ROW    COLUMN+CELL                                   
 1     column=personal:gender, timestamp=1648111949216, value=\xE7\x94\xB7          
 1     column=personal:name, timestamp=1648111949216, value=\xE7\xAB\xA0\xE4\xB8\x89                               
 1     column=professional:employed, timestamp=1648111949216, value=\xE6\x98\xAF    
 2     column=personal:gender, timestamp=1648111949237, value=\xE5\xA5\xB3          
 2     column=personal:marital_status, timestamp=1648111949237, value=\xE5\xB7\xB2\xE5\xA9\x9A                     
 2     column=personal:name, timestamp=1648111949237, value=\xE9\x87\x8C\xE6\x96\xAF                               
 2     column=professional:employed, timestamp=1648111949237, value=\xE6\x98\xAF    
 2     column=professional:field, timestamp=1648111949237, value=constraction1      
 3     column=personal:gender, timestamp=1648111949237, value=\xE5\xA5\xB3          
 3     column=personal:name, timestamp=1648111949237, value=\xE7\x8E\x8B3 
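One detail about the Delete API worth noting: addColumn removes only the most recent version of a cell, addColumns removes all versions, and a Delete built from just a row key removes the entire row. A minimal sketch that would go inside the same try block as SimpleDelete (the row keys here are only for illustration):

            // Remove every version of one cell (addColumn would only drop the newest version)
            Delete oneCell = new Delete(Bytes.toBytes("2"));
            oneCell.addColumns(PERSONAL_CF, MARITAL_STATUS_COLUMN);
            table.delete(oneCell);

            // A Delete with no columns added removes the whole row
            table.delete(new Delete(Bytes.toBytes("3")));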

8. Deleting the Table

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Admin;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;

public class DeleteTable {
    public static void main(String[] args) throws Exception{
        Configuration configuration = HBaseConfiguration.create();
        configuration.set("hbase.zookeeper.property.clientPort", "2181");
        configuration.set("hbase.zookeeper.quorum", "127.0.0.1");
        Connection connection = ConnectionFactory.createConnection(configuration);
        Admin admin = null;
        try{
            admin = connection.getAdmin();
            TableName tableName = TableName.valueOf("census");
            if(admin.tableExists(tableName)){
                System.out.println("Deleting table...");
                admin.disableTable(tableName);
                admin.deleteTable(tableName);
                System.out.println("Table deleted");
            }else{
                System.out.println("Table does not exist");
            }
        }finally {
            if(admin!=null) {
                admin.close();
            }
            connection.close();
        }
    }
}
//Output
Deleting table...
Table deleted

Checking in the hbase shell again, the table no longer exists:

hbase(main):006:0> list
TABLE                                                                                                                                                                                                                                                                         
0 row(s) in 0.0150 seconds

=> []

With this, the HBase part of the Hadoop series is complete. Now to decide which part of Hadoop to write about next; onward!

Thanks for reading, and thanks for taking a moment to give this post a like. Below are the articles I have written previously, in case you would like to keep reading.

Previous Articles

Hadoop Series - Getting Started: Installation
Hadoop Series - HDFS Commands
Hadoop Series - Hive Installation
Hadoop Series - Common Hive SQL Commands
Hadoop Series - HBase Database
Hadoop Series - HBase Database (Part 2)
Hadoop Series - HBase Database: The Java API
Hadoop Series - Spark Installation and Hello World
Hadoop Series - A Small MapReduce Example
Hadoop Series - A Small Spark Example
Java Interview Summary (5): Databases (1)
Java Interview Summary (5): Databases (2)
Java Interview Summary (5): Databases (3)
Java Interview Summary (4): JVM (1)
Java Interview Summary (4): JVM (2)
Java Interview Summary (4): JVM (3)
Java Interview Summary (3): Collections (1)
Java Interview Summary (3): Collections (2)
Java Interview Summary (3): Collections (3)
Java Interview Summary (3): Collections (4)
Java Interview Summary (2): Multithreading (1)
Java Interview Summary (2): Multithreading (2)
Java Interview Summary (2): Multithreading (3)
Java Interview Summary (2): Multithreading (4)
Java Interview Summary (2): Multithreading (5)
Java Interview Summary (2): Multithreading (6)
Java Interview Summary (2): Multithreading (7)
Java Interview Summary (1): Java Fundamentals
