Test environment: Phoenix 4.14.1, HBase 1.1.1, Hive 2.3.3, Hadoop 2.7.2
(Because the cluster was not set up with CDH/Ambari, component compatibility issues show up.)
Background
- Why map Phoenix tables into Hive?
We store data in Phoenix, which is essentially the SQL layer that HBase exposes. It serves front-end queries that must return within seconds. Phoenix is also the natural companion to HBase secondary indexes: in our tests, with a secondary index over millions of rows the fastest queries returned in milliseconds. Beyond serving queries, we also do statistical analysis, data processing, and machine-learning modeling; that work can be done directly against Phoenix or with Spark + Phoenix, but for offline modeling we prefer to use Hive for the more complex processing.
How to integrate Phoenix with Hive
Official reference: http://phoenix.apache.org/hive_storage_handler.html
In practice, using only phoenix-xxx-hbase-xxx-hive.jar as the official docs describe is not sufficient; the details are explained below.
Step 1: Add the three jars phoenix-xxx-hbase-xxx-hive.jar, phoenix-xxx-hbase-xxx-client.jar, and phoenix-core-xxx-HBase-xxx.jar to the Hive environment (this change must also be applied on the Hive clients).
Option 1: copy the jars directly into the hive/lib directory.
Option 2: copy the jars into hive/lib-aux (create the directory if it does not exist), then edit hive-env.sh under hive/conf and add the following environment variable:
export HIVE_AUX_JARS_PATH=${HIVE_HOME}/lib-aux
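A minimal sketch of Option 2, assuming a Phoenix 4.14.1 build for HBase 1.1; the exact jar file names and the ${PHOENIX_HOME} path are assumptions and depend on your distribution:

# create the auxiliary jar directory (Option 2)
mkdir -p ${HIVE_HOME}/lib-aux
# copy the three Phoenix jars (file names assumed for a 4.14.1-HBase-1.1 build)
cp ${PHOENIX_HOME}/phoenix-4.14.1-HBase-1.1-hive.jar ${HIVE_HOME}/lib-aux/
cp ${PHOENIX_HOME}/phoenix-4.14.1-HBase-1.1-client.jar ${HIVE_HOME}/lib-aux/
cp ${PHOENIX_HOME}/phoenix-core-4.14.1-HBase-1.1.jar ${HIVE_HOME}/lib-aux/
# point Hive at the directory (same export as above, appended to hive/conf/hive-env.sh)
echo 'export HIVE_AUX_JARS_PATH=${HIVE_HOME}/lib-aux' >> ${HIVE_HOME}/conf/hive-env.sh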
Step 2: Add the following property to hive-site.xml so that Hive's MapReduce jobs can also see the Phoenix jars. Without this property, Hive statements that run as MapReduce (e.g. statements with ORDER BY) still fail with:
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hbase.ipc.controller.ServerRpcControllerFactory
<property>
  <name>hive.aux.jars.path</name>
  <value>${HIVE_HOME}/lib-aux</value>
</property>
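A quick way to check that the property is picked up (a minimal sketch, assuming the Hive CLI is available on the node):

# should print the configured value of hive.aux.jars.path
hive -e "SET hive.aux.jars.path;"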
Notes (issues encountered):
- The icu4j jar bundled inside the Phoenix jars conflicts with the one in hive/lib, so starting the Hive service fails with the following exception:
Exception in thread "main" java.lang.NoSuchMethodError: com.ibm.icu.impl.ICUBinary.getRequiredData(Ljava/lang/String;)Ljava/nio/ByteBuffer;
Solution: remove icu4j-4.8.1.jar from hive/lib (see the sketch after this list).
- When a Hive statement runs as MapReduce, the following error appears:
FAILED: Execution Error, return code -101 from org.apache.hadoop.hive.ql.exec.mr.MapRedTask. HADOOP_CLASSPATH
Cause: as noted above, using only phoenix-xxx-hbase-xxx-hive.jar per the official docs is not enough; phoenix-xxx-hbase-xxx-client.jar and phoenix-core-xxx-HBase-xxx.jar must be added as well.
- On the HBase side, launching the asynchronous index-population MapReduce job from the HBase command line (run on YARN) had thrown an exception, so disruptor was upgraded there from 3.3.0 to 3.3.11; that upgrade now conflicts with the disruptor-3.3.0.jar version shipped in hive/lib:
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. java.lang.NoSuchMethodError: com.lmax.disruptor.dsl.Disruptor.<init>(Lcom/lmax/disruptor/EventFactory;ILjava/util/concurrent/ThreadFactory;Lcom/lmax/disruptor/dsl/ProducerType;Lcom/lmax/disruptor/WaitStrategy;)
Solution: upgrade disruptor in hive/lib to 3.3.11 as well, or simply remove disruptor from hive/lib. (In testing, upgrading or removing only the client-side disruptor also seemed sufficient; for consistency we upgraded it on both sides.)
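A sketch of the two jar-conflict fixes above; the file names and versions come from the error messages, everything else is an assumption:

# icu4j conflict: drop the copy shipped with Hive
rm ${HIVE_HOME}/lib/icu4j-4.8.1.jar
# disruptor conflict: either remove the old jar from hive/lib ...
rm ${HIVE_HOME}/lib/disruptor-3.3.0.jar
# ... or replace it with the 3.3.11 jar used on the HBase side
cp disruptor-3.3.11.jar ${HIVE_HOME}/lib/
# restart the Hive services afterwards so the classpath change takes effect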
Verification
- Create the Phoenix table
create table IF NOT EXISTS TEST.TEST_PHOENIX (ID INTEGER not null primary key, name varchar,age INTEGER,score decimal, create_time timestamp,status tinyint, date_time time);
UPSERT INTO TEST.TEST_PHOENIX(id,name,age,score,create_time,status,date_time) VALUES(1,'foo',123,999.8,timestamp '2019-01-30 18:15:00',1,time '2019-01-30 18:30:00');
SELECT * FROM TEST.TEST_PHOENIX LIMIT 10;
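The statements above can be run from the Phoenix sqlline client, for example (the ZooKeeper quorum matches the one used in the Hive table properties below; ${PHOENIX_HOME} is an assumption):

${PHOENIX_HOME}/bin/sqlline.py hdc-data1,hdc-data2,hdc-data3:2181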
- Create the Hive external table
create external table TEST_PHOENIX (
id int,
name string,
age int,
score decimal
)
STORED BY 'org.apache.phoenix.hive.PhoenixStorageHandler'
TBLPROPERTIES (
"phoenix.table.name" = "TEST.TEST_PHOENIX",
"phoenix.zookeeper.quorum" = "hdc-data1,hdc-data2,hdc-data3",
"phoenix.zookeeper.znode.parent" = "/hbase",
"phoenix.zookeeper.client.port" = "2181",
"phoenix.rowkeys" = "id",
"phoenix.column.mapping" = "id:ID, name:NAME, age:AGE, score:SCORE"
);
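With the external table in place, the mapping can be verified from Hive. The second query below uses ORDER BY, which runs as a MapReduce job and therefore also exercises the hive.aux.jars.path setting from Step 2 (a minimal sketch, assuming beeline can reach a local HiveServer2):

# simple read through the Phoenix storage handler
beeline -u jdbc:hive2://localhost:10000 -e "SELECT * FROM TEST_PHOENIX LIMIT 10;"
# ORDER BY forces a MapReduce job, validating the aux-jar configuration as well
beeline -u jdbc:hive2://localhost:10000 -e "SELECT name, score FROM TEST_PHOENIX ORDER BY score DESC;"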