Integrating Flink 1.13.6 and Flink CDC with Ambari 2.7.5


Author: 李民_ | Published 2022-12-13 11:37

    Flink installation

    Basic installation reference:

    https://blog.csdn.net/qq_36048223/article/details/116114765

    Exception 1: parent directory /opt/flink/conf doesn't exist

    For some reason the archive was never extracted there; extract it into the directory by hand:

    tar -zxvf flink-1.13.2-bin-scala_2.11.tgz -C /opt/flink

    cd /opt/flink

    mv flink-1.13.2/* /opt/flink

    Exception 2: Sum of configured JVM Metaspace (256.000mb (268435456 bytes)) and JVM Overhead (192.000mb (201326592 bytes)) exceed configured Total Process Memory (256.000mb (268435456 bytes)).

    https://blog.csdn.net/NDF923/article/details/123730372
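    The fix in that link comes down to giving the process enough total memory to cover Metaspace plus overhead. A minimal flink-conf.yaml sketch (these are Flink's standard memory options; the sizes are illustrative, not taken from the post):

```yaml
# flink-conf.yaml: total process memory must exceed
# JVM Metaspace (256 MB) + JVM Overhead (192 MB)
jobmanager.memory.process.size: 1024m
taskmanager.memory.process.size: 2048m
```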

    Integrating flink-cdc for Hive

    https://juejin.cn/post/7176084265161982008

    https://nightlies.apache.org/flink/flink-docs-release-1.13/zh/docs/connectors/table/hive/overview/
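    Per the Flink 1.13 Hive overview linked above, once the connector jars are on the classpath the Hive metastore is exposed through a catalog in the SQL client. A minimal sketch (the catalog name and hive-conf-dir are placeholders for your environment):

```sql
-- Register the Hive metastore as a catalog, then switch to it
CREATE CATALOG myhive WITH (
  'type' = 'hive',
  'hive-conf-dir' = '/etc/hive/conf'
);
USE CATALOG myhive;
SHOW TABLES;
```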

    Exception 3: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V

    See my other post: https://www.jianshu.com/p/e6a76d8422d4

    • 1. Delete com.google.common.base.Preconditions.class from flink-sql-connector-hive-3.1.2_2.11-1.13.6.jar

    • 2. Modify the guava-28.0 source, adding the following to Preconditions.java:

    public static void checkArgument(String errorMessageTemplate, @Nullable Object p1) {
        throw new IllegalArgumentException(lenientFormat(errorMessageTemplate, p1));
    }

    public static void checkArgument(
            @Nullable String errorMessageTemplate,
            Object @Nullable ... errorMessageArgs) {
        throw new IllegalArgumentException(lenientFormat(errorMessageTemplate, errorMessageArgs));
    }
    
    • 3. Put the recompiled com.google.common.base.Preconditions.class back into flink-sql-connector-hive-3.1.2_2.11-1.13.6.jar

    • 4. How to do the replacement

    Unpack: create an empty directory, copy the jar into it, and unpack it with jar xvf flink-sql-connector-hive-3.1.2_2.11-1.13.6.jar. After unpacking, remember to delete the jar file itself by hand.

    Repack: from inside that directory, run jar cvf flink-sql-connector-hive-3.1.2_2.11-1.13.6-update.jar .

    Exception 4: [ERROR] Could not execute SQL statement. Reason:

    java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration

    Run:

    export HADOOP_CLASSPATH=`hadoop classpath`
    

    Exception 5: Exception: Connection refused: localhost/127.0.0.1:8081

    Stop Flink's yarn-session mode, then start it again.

    Exception 6: the Flink SQL Client shell cannot submit SQL to YARN.

    Edit /var/lib/ambari-server/resources/stacks/HDP/{HDP_VERSION}/services/FLINK/package/scripts/flink.py

    If Flink is already installed, you also need to edit /var/lib/ambari-agent/cache/stacks/HDP/{HDP_VERSION}/services/FLINK/package/scripts/flink.py

    Adding the --detached parameter fixes it.
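    In other words, the session the script launches should be detached; with the /opt/flink layout used above, the effective command is equivalent to (--detached, short form -d, is a standard yarn-session.sh flag; it requires a running YARN cluster):

```shell
/opt/flink/bin/yarn-session.sh --detached
```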

    Exception 7: [ERROR] Could not execute SQL statement. Reason:

    java.lang.RuntimeException: The Yarn application application_XXXX doesn't run anymore.

    Note: if the yarn-session was started as the flink user (Ambari's default), then sql-client.sh must also be run as the flink user for SQL to be submitted to YARN successfully.
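    Concretely, that means launching the client under the same account that owns the session (path as in the install above; this is a command sketch, not runnable without the cluster):

```shell
# run the SQL client as the flink user that started the yarn-session
sudo -u flink /opt/flink/bin/sql-client.sh
```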

    Finally

    It's best to put

    export HADOOP_CLASSPATH=`hadoop classpath`
    

    into /etc/profile. That saves a lot of trouble when running sql-client.


    Original link: https://www.haomeiwen.com/subject/aqshqdtx.html