1. Build Tez. See https://www.jianshu.com/p/b2569796dd27 for the build steps.
2. Upload the compiled tez-0.9.2.tar.gz to HDFS; tez-site.xml references it through the tez.lib.uris property, e.g.:
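A minimal sketch of the upload (the /apps/tez directory is my choice; it just has to match the tez.lib.uris value below):
$ hdfs dfs -mkdir -p /apps/tez
$ hdfs dfs -put tez-0.9.2.tar.gz /apps/tez/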
3. Create tez-site.xml under $HADOOP_HOME/etc/hadoop with the following content:
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>tez.lib.uris</name>
    <value>${fs.defaultFS}/apps/tez/tez-0.9.2.tar.gz</value>
  </property>
  <!-- Whether to use the cluster's own Hadoop libs: set this to true to run with
       the minimal Tez package, or false to use the full tez-0.9.2.tar.gz. -->
  <property>
    <name>tez.use.cluster.hadoop-libs</name>
    <value>false</value>
  </property>
  <property>
    <name>tez.am.resource.memory.mb</name>
    <value>1024</value>
  </property>
  <property>
    <name>tez.am.resource.cpu.vcores</name>
    <value>1</value>
  </property>
  <property>
    <name>tez.container.max.java.heap.fraction</name>
    <value>0.4</value>
  </property>
  <property>
    <name>tez.task.resource.memory.mb</name>
    <value>1024</value>
  </property>
  <property>
    <name>tez.task.resource.cpu.vcores</name>
    <value>1</value>
  </property>
</configuration>
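Sizing note: tez.container.max.java.heap.fraction is the share of container memory given to the JVM heap, so with the 1024 MB containers above each task JVM gets roughly 0.4 × 1024 ≈ 410 MB. Raise the memory settings and the fraction together if your queries need more headroom.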
4. Configure Hive by adding the following to hive-site.xml:
<property>
  <name>hive.execution.engine</name>
  <value>tez</value>
</property>
<!-- Authentication mode; NONE is fine for local testing.
     See http://lxw1234.com/archives/2016/01/600.htm for details. -->
<property>
  <name>hive.server2.authentication</name>
  <value>NONE</value>
</property>
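To try Tez without changing the global default, you can also switch engines per session from the Hive shell:
hive> set hive.execution.engine=tez;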
5. Set up Tez on the client. Extract tez-0.9.2.tar.gz into /usr/tez on the local machine, e.g.:
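A sketch of the extraction, assuming the tarball sits in the current directory (if the archive has no top-level directory, as the Tez dist tarball typically doesn't, create one first):
$ mkdir -p /usr/tez/tez-0.9.2
$ tar -zxf tez-0.9.2.tar.gz -C /usr/tez/tez-0.9.2
$ cd /usr/tez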
# Symlink the versioned directory so future upgrades only need to repoint the link.
$ ln -s tez-0.9.2 default
# Configure the local environment variables (see the note after this block):
export TEZ_CONF_DIR=$HADOOP_CONF_DIR
export TEZ_HOME=/usr/tez/default
export TEZ_JARS=$TEZ_HOME/*:$TEZ_HOME/lib/*
export HADOOP_CLASSPATH=$TEZ_CONF_DIR:$TEZ_JARS:$HADOOP_CLASSPATH
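These exports only apply to the current shell. To make them permanent, append them to a profile script and reload it (the file is deployment-specific; /etc/profile is just one common choice):
$ source /etc/profile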
6. Restart the services.
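Exactly which daemons need a restart depends on your deployment; for a typical standalone setup it is at least the metastore and HiveServer2, for example:
$ hive --service metastore &
$ hive --service hiveserver2 &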
You may then run into the following errors.
Error 1: java.lang.NoClassDefFoundError: com/google/common/net/UrlEscapers
2020-06-20T08:57:06,834 INFO [main] org.apache.hadoop.service.AbstractService - Service org.apache.tez.dag.app.DAGAppMaster failed in state STARTED; cause: org.apache.hadoop.service.ServiceStateException: java.lang.NoClassDefFoundError: com/google/common/net/UrlEscapers
org.apache.hadoop.service.ServiceStateException: java.lang.NoClassDefFoundError: com/google/common/net/UrlEscapers
at org.apache.hadoop.service.ServiceStateException.convert(ServiceStateException.java:59) ~[hadoop-common-2.7.7.jar:?]
at org.apache.tez.dag.app.DAGAppMaster.startServices(DAGAppMaster.java:1945) ~[tez-dag-0.9.2.jar:0.9.2]
at org.apache.tez.dag.app.DAGAppMaster.serviceStart(DAGAppMaster.java:2012) ~[tez-dag-0.9.2.jar:0.9.2]
at org.apache.hadoop.service.AbstractService.start(AbstractService.java:193) [hadoop-common-2.7.7.jar:?]
at org.apache.tez.dag.app.DAGAppMaster$9.run(DAGAppMaster.java:2663) [tez-dag-0.9.2.jar:0.9.2]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_91]
at javax.security.auth.Subject.doAs(Subject.java:422) [?:1.8.0_91]
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1762) [hadoop-common-2.7.7.jar:?]
at org.apache.tez.dag.app.DAGAppMaster.initAndStartAppMaster(DAGAppMaster.java:2659) [tez-dag-0.9.2.jar:0.9.2]
at org.apache.tez.dag.app.DAGAppMaster.main(DAGAppMaster.java:2464) [tez-dag-0.9.2.jar:0.9.2]
Fix: I upgraded guava-14.0.1.jar in $HIVE_HOME/lib to guava-19.0.jar, which resolved this error. Some errors still appear in the logs afterwards, but they don't affect the query results.
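A sketch of the jar swap; the guava-19.0.jar source path is a placeholder for wherever you downloaded it:
# /path/to/guava-19.0.jar below is a placeholder
$ mv $HIVE_HOME/lib/guava-14.0.1.jar /tmp/    # keep the old jar around for rollback
$ cp /path/to/guava-19.0.jar $HIVE_HOME/lib/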
Error 2:
Submitting SQL through the hive CLI works fine, and the jobs show up in YARN.
But when connecting with **beeline -u jdbc:hive2://localhost:10000**, no job appears in YARN and the query fails; the error log is only visible in the Hive UI.
Fix:
hive.log in the local logs showed this was a file-permission problem:
beeline needs an explicit username.
beeline -u jdbc:hive2://localhost:10000 -n wl
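With the username supplied, a quick smoke test from beeline (the table name is hypothetical; any query that launches a job will do) should now show up in YARN:
0: jdbc:hive2://localhost:10000> select count(*) from test_table;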