Flink Troubleshooting (5): How to Read Kerberos-Authenticated HDFS

Author: ZYvette | Published 2020-05-11 19:44

Problem: How can Flink 1.8 read data from Kerberos-secured HDFS? Without the setup below, the job fails with:

org.apache.hadoop.security.AccessControlException:
SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]

Solution:

  1. Point HADOOP_CONF_DIR at the directory that contains core-site.xml and hdfs-site.xml:

export HADOOP_CONF_DIR={directory containing core-site.xml and hdfs-site.xml}
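For example, on a cluster whose client configs live under /etc/hadoop/conf (a hypothetical path; substitute wherever your cluster actually keeps them):

```shell
# Point Flink's Hadoop integration at the cluster's client configuration.
# /etc/hadoop/conf is an assumed path -- use the directory where
# core-site.xml and hdfs-site.xml actually live on your machine.
export HADOOP_CONF_DIR=/etc/hadoop/conf
```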

  2. Configure conf/flink-conf.yaml:

security.kerberos.login.use-ticket-cache: false
security.kerberos.login.keytab: <path to keytab>
security.kerberos.login.principal: <principal>
env.java.opts: -Djava.security.krb5.conf=<path to krb5 conf>
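Filled in with example values for illustration (the keytab path, principal, and krb5.conf location below are hypothetical placeholders, not defaults; they must match your own Kerberos setup):

```yaml
# flink-conf.yaml -- example values only.
security.kerberos.login.use-ticket-cache: false
security.kerberos.login.keytab: /etc/security/keytabs/flink.keytab
security.kerberos.login.principal: flink/worker1@EXAMPLE.COM
env.java.opts: -Djava.security.krb5.conf=/etc/krb5.conf
```

With use-ticket-cache set to false, Flink authenticates from the keytab alone, so long-running jobs do not depend on a user having run kinit first.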

  3. Add the Hadoop uber JAR for the matching Hadoop version to Flink's lib directory; pre-bundled Hadoop JARs are available at https://flink.apache.org/downloads.html

  4. The Flink project's build file must declare the following dependencies (shown here in Gradle syntax; the equivalent Maven pom.xml entries work the same way):

compile "org.apache.flink:flink-java:$flinkVersion"
compile "org.apache.flink:flink-clients_2.11:$flinkVersion"
// Double quotes, so Gradle interpolates $hadoopVersion (single quotes would not)
compile "org.apache.hadoop:hadoop-hdfs:$hadoopVersion"
compile "org.apache.hadoop:hadoop-client:$hadoopVersion"
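With the configuration above in place, the job code itself needs no explicit authentication calls: the Flink runtime logs in from the keytab before the job starts. A minimal sketch using the Flink 1.8 DataSet API (the namenode host, port, and file path are hypothetical placeholders):

```java
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;

public class SecureHdfsReadJob {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // The hdfs:// URI is resolved through the configs found via
        // HADOOP_CONF_DIR; Kerberos credentials come from flink-conf.yaml.
        // Host, port, and path below are placeholders.
        DataSet<String> lines =
                env.readTextFile("hdfs://namenode.example.com:8020/data/input.txt");

        // print() triggers execution of the DataSet program.
        lines.print();
    }
}
```

Submitted with `flink run`, the TaskManagers authenticate with the configured keytab, so the AccessControlException above should no longer occur.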

Reference:

https://stackoverflow.com/questions/34596165/how-to-do-kerberos-authentication-on-a-flink-standalone-installation


Permalink: https://www.haomeiwen.com/subject/npepnhtx.html