Question: How can Flink 1.8 read data from Kerberos-secured HDFS? The job fails with:
org.apache.hadoop.security.AccessControlException:
SIMPLE authentication is not enabled. Available:[TOKEN, KERBEROS]
Solution:
1. export HADOOP_CONF_DIR=<directory containing core-site.xml and hdfs-site.xml>
2. Configure flink-conf.yaml:
security.kerberos.login.use-ticket-cache: false
security.kerberos.login.keytab: <path to keytab>
security.kerberos.login.principal: <principal>
env.java.opts: -Djava.security.krb5.conf=<path to krb5 conf>
3. Add the Hadoop uber JAR matching your Hadoop version to Flink's lib directory (download from https://flink.apache.org/downloads.html).
4. The Flink project's build file must include the following dependencies (Gradle syntax):
compile "org.apache.flink:flink-java:$flinkVersion"
compile "org.apache.flink:flink-clients_2.11:$flinkVersion"
compile "org.apache.hadoop:hadoop-hdfs:$hadoopVersion"
compile "org.apache.hadoop:hadoop-client:$hadoopVersion"
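The dependency list above is written in Gradle syntax; for a Maven pom.xml, an equivalent dependencies block might look like the sketch below. The `flink.version` and `hadoop.version` properties are placeholders you would define yourself in the pom's `<properties>` section.

```xml
<!-- Sketch of the equivalent Maven dependencies; flink.version and
     hadoop.version are assumed placeholder properties, not defined here -->
<dependencies>
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-java</artifactId>
    <version>${flink.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-clients_2.11</artifactId>
    <version>${flink.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-hdfs</artifactId>
    <version>${hadoop.version}</version>
  </dependency>
  <dependency>
    <groupId>org.apache.hadoop</groupId>
    <artifactId>hadoop-client</artifactId>
    <version>${hadoop.version}</version>
  </dependency>
</dependencies>
```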