HDFS Kerberos Integration

By xuefly | Published 2018-03-31 09:25

    Part of the series Big Data Security in Practice: https://www.jianshu.com/p/76627fd8399c


    Steps

    1. Create the Kerberos principals (a kadmin sketch follows this list)
    2. Modify core-site.xml
    3. Modify hdfs-site.xml
    4. Configure HTTPS
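
    A minimal sketch of step 1, assuming an MIT KDC and the TT.COM realm used
    throughout this article; the hostname is a placeholder and the commands
    must be repeated for every node in the cluster:

    # On the KDC host: create one hdfs and one HTTP (SPNEGO) principal per
    # node, then export both into the shared service keytab.
    kadmin.local -q "addprinc -randkey hdfs/nn01.example.com@TT.COM"
    kadmin.local -q "addprinc -randkey HTTP/nn01.example.com@TT.COM"
    kadmin.local -q "ktadd -k /etc/hadoop/conf/hdfs-service.keytab hdfs/nn01.example.com@TT.COM HTTP/nn01.example.com@TT.COM"

    # Restrict the keytab to the user running the HDFS daemons
    # (owner and group here are placeholders).
    chown hdfs:hadoop /etc/hadoop/conf/hdfs-service.keytab
    chmod 400 /etc/hadoop/conf/hdfs-service.keytab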

    Modify core-site.xml and hdfs-site.xml

    <!-- 2018-3-9 Kerberos -->
    <!-- The two hadoop.security.* properties below belong in core-site.xml;
         all dfs.* properties that follow belong in hdfs-site.xml. -->
    <property>
      <name>hadoop.security.authentication</name>
      <value>kerberos</value>
    </property>
    
    <property>
      <name>hadoop.security.authorization</name>
      <value>true</value>
    </property>
    
    <property>
      <name>dfs.block.access.token.enable</name>
      <value>true</value>
    </property>
    <property>
      <name>dfs.datanode.data.dir.perm</name>
      <value>700</value>
    </property>
    
    <property>
      <name>dfs.namenode.keytab.file</name>
      <value>/etc/hadoop/conf/hdfs-service.keytab</value>
    </property>
    
    <property>
      <name>dfs.namenode.kerberos.principal</name>
      <value>hdfs/_HOST@TT.COM</value>
    </property>
    <property>
      <name>dfs.namenode.kerberos.https.principal</name>
      <value>HTTP/_HOST@TT.COM</value>
    </property>
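    <!-- Note: dfs.namenode.kerberos.https.principal is a legacy property
         name; current Hadoop 2.x documentation uses
         dfs.namenode.kerberos.internal.spnego.principal instead (compare
         the journalnode setting below). -->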
    
    
    <!--
    Privileged-port variant, commented out; see the DataNode startup error
    discussed below.
    <property>
      <name>dfs.datanode.address</name>
      <value>0.0.0.0:1004</value>
    </property>
    <property>
      <name>dfs.datanode.http.address</name>
      <value>0.0.0.0:1006</value>
    </property>
    -->
    <property>
      <name>dfs.datanode.address</name>
      <value>0.0.0.0:61004</value>
    </property>
    <property>
      <name>dfs.datanode.http.address</name>
      <value>0.0.0.0:61006</value>
    </property>
    
    <property>
      <name>dfs.http.policy</name>
      <value>HTTPS_ONLY</value>
    </property>
    
    <property>
      <name>dfs.data.transfer.protection</name>
      <value>integrity</value>
    </property>
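    <!-- dfs.data.transfer.protection accepts authentication, integrity, or
         privacy; privacy adds wire encryption on top of integrity checking. -->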
    
    <property>
      <name>dfs.datanode.keytab.file</name>
      <value>/etc/hadoop/conf/hdfs-service.keytab</value>
    </property>
    <property>
      <name>dfs.datanode.kerberos.principal</name>
      <value>hdfs/_HOST@TT.COM</value>
    </property>
    <property>
      <name>dfs.datanode.kerberos.https.principal</name>
      <value>HTTP/_HOST@TT.COM</value>
    </property>
    
    <property>
      <name>dfs.journalnode.keytab.file</name>
      <value>/etc/hadoop/conf/hdfs-service.keytab</value>
    </property>
    <property>
      <name>dfs.journalnode.kerberos.principal</name>
      <value>hdfs/_HOST@TT.COM</value>
    </property>
    <property>
      <name>dfs.journalnode.kerberos.internal.spnego.principal</name>
      <value>HTTP/_HOST@TT.COM</value>
    </property>
    <property>
      <name>dfs.webhdfs.enabled</name>
      <value>true</value>
    </property>
    <property>
      <name>dfs.web.authentication.kerberos.principal</name>
      <value>HTTP/_HOST@TT.COM</value>
    </property>
    
    <property>
      <name>dfs.web.authentication.kerberos.keytab</name>
      <value>/etc/hadoop/conf/hdfs-service.keytab</value>
    </property>
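
    Before restarting the daemons, it is worth checking that the keytab
    referenced above actually contains both the hdfs and HTTP principals:

    # List all entries in the service keytab, with timestamps.
    klist -kt /etc/hadoop/conf/hdfs-service.keytab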
    
    
    • After applying the configuration above, the DataNode failed to start with the following error:
    java.lang.RuntimeException: Cannot start secure DataNode without configuring either privileged resources or SASL RPC data transfer protection and SSL for HTTP.  Using privileged resources in combination with SASL RPC data transfer protection is not supported.
            at org.apache.hadoop.hdfs.server.datanode.DataNode.checkSecureConfig(DataNode.java:1201)
            at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1101)
            at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:429)
            at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2406)
            at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2293)
            at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2340)
            at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2522)
            at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2546)
    2018-03-13 14:01:27,317 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
    2018-03-13 14:01:27,318 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: SHUTDOWN_MSG:
    

    The message "Using privileged resources in combination with SASL RPC data transfer protection is not supported." says that privileged resources (i.e., ports below 1024) cannot be combined with SASL RPC data transfer protection.
    That leaves two options: either drop dfs.data.transfer.protection and keep the privileged ports (which requires launching the DataNode as root via jsvc), or keep SASL protection and move the DataNode to non-privileged ports with HTTPS enabled. This article takes the second option:

    <!--
    Option 1: privileged ports, which require launching the DataNode as root
    via jsvc.
    <property>
      <name>dfs.datanode.address</name>
      <value>0.0.0.0:1004</value>
    </property>
    <property>
      <name>dfs.datanode.http.address</name>
      <value>0.0.0.0:1006</value>
    </property>
    -->

    <!-- Option 2: non-privileged ports, protected by SASL and HTTPS. -->
    <property>
      <name>dfs.datanode.address</name>
      <value>0.0.0.0:61004</value>
    </property>
    <property>
      <name>dfs.datanode.http.address</name>
      <value>0.0.0.0:61006</value>
    </property>

    After this change, the NameNode then failed with a different error:

    2018-03-09 20:44:10,993 INFO org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler: Login using keytab /etc/hadoop/conf/hdfs-service.keytab, for principal HTTP/v-hadoop-kbds.sz.kingdee.net@TT.COM
    2018-03-09 20:44:11,000 INFO org.apache.hadoop.security.authentication.server.KerberosAuthenticationHandler: Login using keytab /etc/hadoop/conf/hdfs-service.keytab, for principal HTTP/v-hadoop-kbds.sz.kingdee.net@TT.COM
    2018-03-09 20:44:11,003 WARN org.mortbay.log: failed SslSelectChannelConnectorSecure@0.0.0.0:50470: java.io.FileNotFoundException: /home/kduser/.keystore (No such file or directory)
    2018-03-09 20:44:11,003 WARN org.mortbay.log: failed Server@10ded6a9: java.io.FileNotFoundException: /home/kduser/.keystore (No such file or directory)
    2018-03-09 20:44:11,003 INFO org.apache.hadoop.http.HttpServer2: HttpServer.start() threw a non Bind IOException
    java.io.FileNotFoundException: /home/kduser/.keystore (No such file or directory)
            at java.io.FileInputStream.open0(Native Method)
            at java.io.FileInputStream.open(FileInputStream.java:195)
            at java.io.FileInputStream.<init>(FileInputStream.java:138)
            at org.mortbay.resource.FileResource.getInputStream(FileResource.java:275)
            at org.mortbay.jetty.security.SslSelectChannelConnector.createSSLContext(SslSelectChannelConnector.java:624)
            at org.mortbay.jetty.security.SslSelectChannelConnector.doStart(SslSelectChannelConnector.java:598)
            at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
            at org.mortbay.jetty.Server.doStart(Server.java:235)
            at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
            at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:877)
            at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:142)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:760)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:639)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:819)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:803)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1500)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1566)
    2018-03-09 20:44:11,006 INFO org.mortbay.log: Stopped SslSelectChannelConnectorSecure@0.0.0.0:50470
    2018-03-09 20:44:11,107 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: Stopping NameNode metrics system...
    2018-03-09 20:44:11,108 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system stopped.
    2018-03-09 20:44:11,108 INFO org.apache.hadoop.metrics2.impl.MetricsSystemImpl: NameNode metrics system shutdown complete.
    2018-03-09 20:44:11,108 ERROR org.apache.hadoop.hdfs.server.namenode.NameNode: Failed to start namenode.
    java.io.FileNotFoundException: /home/kduser/.keystore (No such file or directory)
            at java.io.FileInputStream.open0(Native Method)
            at java.io.FileInputStream.open(FileInputStream.java:195)
            at java.io.FileInputStream.<init>(FileInputStream.java:138)
            at org.mortbay.resource.FileResource.getInputStream(FileResource.java:275)
            at org.mortbay.jetty.security.SslSelectChannelConnector.createSSLContext(SslSelectChannelConnector.java:624)
            at org.mortbay.jetty.security.SslSelectChannelConnector.doStart(SslSelectChannelConnector.java:598)
            at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
            at org.mortbay.jetty.Server.doStart(Server.java:235)
            at org.mortbay.component.AbstractLifeCycle.start(AbstractLifeCycle.java:50)
            at org.apache.hadoop.http.HttpServer2.start(HttpServer2.java:877)
            at org.apache.hadoop.hdfs.server.namenode.NameNodeHttpServer.start(NameNodeHttpServer.java:142)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.startHttpServer(NameNode.java:760)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.initialize(NameNode.java:639)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:819)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.<init>(NameNode.java:803)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.createNameNode(NameNode.java:1500)
            at org.apache.hadoop.hdfs.server.namenode.NameNode.main(NameNode.java:1566)
    2018-03-09 20:44:11,110 INFO org.apache.hadoop.util.ExitUtil: Exiting with status 1
    2018-03-09 20:44:11,111 INFO org.apache.hadoop.hdfs.server.namenode.NameNode: SHUTDOWN_MSG:
    /************************************************************
    SHUTDOWN_MSG: Shutting down NameNode at v-hadoop-kbds.sz.kingdee.net/172.20.178.28
    ************************************************************/
    

    The NameNode is looking for an SSL keystore (falling back to the default /home/kduser/.keystore) because dfs.http.policy is set to HTTPS_ONLY, so HTTPS must now be configured.
    Reference: HTTPS configuration for HDFS
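
    A minimal sketch of the keystore side of that setup, assuming a
    self-signed certificate is acceptable for testing; the alias, paths, and
    passwords are placeholders:

    # Generate a self-signed keystore; the CN should match the host's FQDN
    # (here taken from the log above).
    keytool -genkeypair -alias jetty -keyalg RSA -keysize 2048 \
      -dname "CN=v-hadoop-kbds.sz.kingdee.net" \
      -keystore /etc/hadoop/conf/keystore.jks \
      -storepass changeit -keypass changeit -validity 365

    # Then point ssl-server.xml (ssl.server.keystore.location,
    # ssl.server.keystore.password, ssl.server.keystore.keypassword) at this
    # keystore so the server stops falling back to ~/.keystore.

    Once the daemons come up cleanly, access can be verified with a ticket
    obtained from the service keytab:

    kinit -kt /etc/hadoop/conf/hdfs-service.keytab hdfs/v-hadoop-kbds.sz.kingdee.net@TT.COM
    klist
    hdfs dfs -ls /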
