
CDH Component Configuration

By Grey____ | Published 2019-03-19 16:51
Configuring external access to Kafka
  1. Open the Kafka service.
  2. Click Configuration.
  3. Search for the Kafka Broker Advanced Configuration Snippet (Safety Valve) for kafka.properties.
  4. Enter:
# listeners must listen on all interfaces.
listeners=PLAINTEXT://:9092
# hadoop-slave2 is used here because the internal and external networks are
# isolated and cannot reach each other directly; external clients connect by
# resolving this hostname through local hosts entries.
advertised.listeners=PLAINTEXT://hadoop-slave2:9092
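For this to work, the external client machine must resolve hadoop-slave2 to the broker's externally reachable address, since Kafka hands advertised.listeners back to clients verbatim in its metadata responses. A minimal client-side hosts entry might look like the following (the IP address is a placeholder, not from the original article):

```
# /etc/hosts on the external client machine
# 203.0.113.10 stands in for the broker's external IP
203.0.113.10  hadoop-slave2
```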
spark2-shell reports an error

Reference: spark-shell error

Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/fs/FSDataInputStream
    at org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:117)
    at org.apache.spark.deploy.SparkSubmitArguments$$anonfun$mergeDefaultSparkProperties$1.apply(SparkSubmitArguments.scala:117)
    at scala.Option.getOrElse(Option.scala:120)
    at org.apache.spark.deploy.SparkSubmitArguments.mergeDefaultSparkProperties(SparkSubmitArguments.scala:117)
    at org.apache.spark.deploy.SparkSubmitArguments.<init>(SparkSubmitArguments.scala:103)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:114)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.fs.FSDataInputStream
    at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    ... 7 more
  • Local Linux fix: run export SPARK_DIST_CLASSPATH=$(${HADOOP_HOME}/bin/hadoop classpath) (on CDH the hadoop binary is at /opt/cloudera/parcels/CDH/lib/hadoop/bin/hadoop).
  • Remote fix (jobs launched from Azkaban still failed):
    1. In the CDH Spark configuration, edit the Spark 2 Service Advanced Configuration Snippet (Safety Valve) for spark2-conf/spark-env.sh and append export SPARK_DIST_CLASSPATH=$(/opt/cloudera/parcels/CDH/lib/hadoop/bin/hadoop classpath)
    2. Restart; the client-config deploy log then showed:
    Wed Mar 20 10:48:34 CST 2019
    using  as JAVA_HOME
    using 5 as CDH_VERSION
    using /opt/cm-5.15.1/run/cloudera-scm-agent/process/ccdeploy_spark2-conf_etcspark2conf.cloudera.spark2_on_yarn_1787738555099046011 as CONF_DIR
    using spark2-conf as DIRECTORY_NAME
    using /etc/spark2/conf.cloudera.spark2_on_yarn as DEST_PATH
    using spark2-conf as ALT_NAME
    using /etc/spark2/conf as ALT_LINK
    using 51 as PRIORITY
    using scripts/control.sh as RUNNER_PROGRAM
    using client as RUNNER_ARGS
    using /usr/sbin/update-alternatives as UPDATE_ALTERNATIVES
    Deploying service client configs to /etc/spark2/conf.cloudera.spark2_on_yarn
    invoking optional deploy script scripts/control.sh
    /opt/cm-5.15.1/run/cloudera-scm-agent/process/ccdeploy_spark2-conf_etcspark2conf.cloudera.spark2_on_yarn_1787738555099046011/spark2-conf /opt/cm-5.15.1/run/cloudera-scm-agent/process/ccdeploy_spark2-conf_etcspark2conf.cloudera.spark2_on_yarn_1787738555099046011
    Wed Mar 20 10:48:35 CST 2019: Running Spark2 CSD control script...
    Wed Mar 20 10:48:35 CST 2019: Detected CDH_VERSION of [5]
    Java version 1.8 is required for Spark 2.2.
    
    3. Setting CDH's JAVA_HOME fixes this (note the empty "using  as JAVA_HOME" line in the log above); the method is described in the next section.
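The check that fails here can be sketched in shell: the Spark 2 CSD control script aborts when the detected Java major version is below 1.8. The parsing below only illustrates the idea and is not the script's actual code.

```shell
# Illustration only: reject Java releases older than 1.8, as the Spark 2 CSD does.
version='java version "1.7.0_80"'          # e.g. first line of `java -version` output
major=$(echo "$version" | sed -n 's/.*"1\.\([0-9]*\).*/\1/p')
if [ "$major" -lt 8 ]; then
  echo "Java version 1.8 is required for Spark 2.2."
fi
```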
How to change CDH's JAVA_HOME

Reference: how to change CDH's JAVA_HOME
Hosts → All Hosts → click any host → Configuration → set the Java Home Directory to the desired JAVA_HOME → restart
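As an alternative to the Cloudera Manager UI, CDH's bigtop scripts also honor a JAVA_HOME exported in /etc/default/bigtop-utils on each host; whether this applies depends on how your cluster is deployed, and the JDK path below is an example, not from the article:

```
# /etc/default/bigtop-utils  (on every cluster host; example path)
export JAVA_HOME=/usr/java/jdk1.8.0_181
```

Either way, once the hosts point at a 1.8 JDK, the Spark 2 CSD's Java check should pass and the client-config deploy succeed.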


Source: https://www.haomeiwen.com/subject/uzrnmqtx.html