
Getting Started with Flink: Logging Configuration

Author: zfylin | Published 2020-09-02 16:39

    Preface

    The logging configuration described in this article is based on Flink 1.10.

    Configuring Log4j

    Flink 1.10 uses Log4j as its default logging framework. The configuration files are as follows (a sketch of the conf directory follows the list):

    • log4j-cli.properties: used by the Flink command-line client (e.g. flink run)
    • log4j-yarn-session.properties: used by the command-line client when starting a YARN session (yarn-session.sh)
    • log4j.properties: JobManager/TaskManager logs (both standalone and YARN)
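
    In a standard Flink 1.10 distribution these files live in ${FLINK_HOME}/conf, alongside the Logback variants. The listing below is only an illustrative sketch; the exact contents can differ between builds:

    $ ls ${FLINK_HOME}/conf
    flink-conf.yaml        log4j-console.properties       logback-console.xml  masters
    log4j-cli.properties   log4j-yarn-session.properties  logback-yarn.xml     slaves
    log4j.properties       logback.xml                    sql-client-defaults.yaml
    zoo.cfg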

    Default configuration

    The contents of log4j.properties are as follows:

    # This affects logging for both user code and Flink
    log4j.rootLogger=INFO, file
    
    # Uncomment this if you want to _only_ change Flink's logging
    #log4j.logger.org.apache.flink=INFO
    
    # The following lines keep the log level of common libraries/connectors on
    # log level INFO. The root logger does not override this. You have to manually
    # change the log levels here.
    log4j.logger.akka=INFO
    log4j.logger.org.apache.kafka=INFO
    log4j.logger.org.apache.hadoop=INFO
    log4j.logger.org.apache.zookeeper=INFO
    
    # Log all infos in the given file
    log4j.appender.file=org.apache.log4j.FileAppender
    log4j.appender.file.file=${log.file}
    log4j.appender.file.append=false
    log4j.appender.file.layout=org.apache.log4j.PatternLayout
    log4j.appender.file.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %-5p %-60c %x - %m%n
    
    
    # Suppress the irrelevant (wrong) warnings from the Netty channel handler
    log4j.logger.org.apache.flink.shaded.akka.org.jboss.netty.channel.DefaultChannelPipeline=ERROR, file
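
    With this ConversionPattern, each line in the log file looks roughly like the following (an illustrative, made-up line; the logger name is left-padded to 60 characters and the message is a placeholder):

    2020-09-02 16:39:00,123 INFO  org.apache.flink.runtime.taskexecutor.TaskExecutor          - <log message>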
    

    Rolling configuration

    With the default configuration, the JobManager and TaskManager logs are written to separate files, and each file keeps growing in size without bound.

    For production it is recommended to configure the log files to roll by size. The configuration is as follows:

    # This affects logging for both user code and Flink
    log4j.rootLogger=INFO, R
     
    # Uncomment this if you want to _only_ change Flink's logging
    #log4j.logger.org.apache.flink=INFO
     
    # The following lines keep the log level of common libraries/connectors on
    # log level INFO. The root logger does not override this. You have to manually
    # change the log levels here.
    log4j.logger.akka=INFO
    log4j.logger.org.apache.kafka=INFO
    log4j.logger.org.apache.hadoop=INFO
    log4j.logger.org.apache.zookeeper=INFO
     
     
    log4j.appender.R=org.apache.log4j.RollingFileAppender
    log4j.appender.R.File=${log.file}
    log4j.appender.R.MaxFileSize=256MB
    log4j.appender.R.Append=true
    log4j.appender.R.MaxBackupIndex=10
    log4j.appender.R.layout=org.apache.log4j.PatternLayout
    log4j.appender.R.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %t %-5p %-60c %x - %m%n
     
    # Suppress the irrelevant (wrong) warnings from the Netty channel handler
    log4j.logger.org.apache.flink.shaded.akka.org.jboss.netty.channel.DefaultChannelPipeline=ERROR, R
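
    With MaxFileSize=256MB and MaxBackupIndex=10, the RollingFileAppender keeps the current file plus at most ten rolled copies with numeric suffixes; anything older is deleted. Taking the TaskManager log as an example (the actual file name comes from ${log.file}, which Flink sets per process, so the placeholders below are only illustrative):

    flink-<user>-taskexecutor-0-<host>.log      # current file, rolled once it reaches 256 MB
    flink-<user>-taskexecutor-0-<host>.log.1    # most recently rolled file
    ...
    flink-<user>-taskexecutor-0-<host>.log.10   # oldest copy kept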
    

    Kafka configuration

    Sometimes you need to send the logs to Kafka for monitoring and alerting, or to collect them centrally into an ELK stack for inspection and analysis. In that case you can use KafkaLog4jAppender to ship the logs to Kafka. The configuration is as follows:

    # This affects logging for both user code and Flink
    log4j.rootLogger=INFO, kafka
    
    
    # Uncomment this if you want to _only_ change Flink's logging
    #log4j.logger.org.apache.flink=INFO
    
    # The following lines keep the log level of common libraries/connectors on
    # log level INFO. The root logger does not override this. You have to manually
    # change the log levels here.
    # !!! The loggers below must also list the kafka appender, otherwise logging can get stuck on the Kafka send !!!
    log4j.logger.akka=INFO, kafka
    log4j.logger.org.apache.kafka=INFO, kafka
    log4j.logger.org.apache.hadoop=INFO, kafka
    log4j.logger.org.apache.zookeeper=INFO, kafka
    
    
    # log send to kafka
    log4j.appender.kafka=org.apache.kafka.log4jappender.KafkaLog4jAppender
    log4j.appender.kafka.brokerList=localhost:9092
    log4j.appender.kafka.topic=flink_logs
    log4j.appender.kafka.compressionType=none
    log4j.appender.kafka.requiredNumAcks=0
    log4j.appender.kafka.syncSend=false
    log4j.appender.kafka.layout=org.apache.log4j.PatternLayout
    log4j.appender.kafka.layout.ConversionPattern=[frex] [%d{yyyy-MM-dd HH:mm:ss,SSS}] [%p] %c{1}:%L %x - %m%n
    log4j.appender.kafka.level=INFO
    
    
    # Suppress the irrelevant (wrong) warnings from the Netty channel handler
    log4j.logger.org.apache.flink.shaded.akka.org.jboss.netty.channel.DefaultChannelPipeline=ERROR, kafka
    

    You also need to place the kafka-log4j-appender jar in ${FLINK_HOME}/lib.
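
    For example, the appender can be downloaded from Maven Central and dropped into the lib directory. The version below is an assumption for illustration; pick one that matches your Kafka brokers. The appender also depends on kafka-clients, so you may need that jar as well if it is not already on the classpath:

    # version chosen for illustration only - match it to your Kafka cluster
    KAFKA_VERSION=2.2.0
    cd ${FLINK_HOME}/lib
    curl -O https://repo1.maven.org/maven2/org/apache/kafka/kafka-log4j-appender/${KAFKA_VERSION}/kafka-log4j-appender-${KAFKA_VERSION}.jar
    curl -O https://repo1.maven.org/maven2/org/apache/kafka/kafka-clients/${KAFKA_VERSION}/kafka-clients-${KAFKA_VERSION}.jar

    After restarting the cluster and running a job, you can verify that log lines are arriving with the standard console consumer, e.g. kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic flink_logs.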

    Configuring Logback

    To be continued.
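
    Until this section is written, here is a minimal sketch of a logback.xml with size-based rolling that mirrors the Log4j rolling configuration above. The appender name, file size, backup count and pattern are assumptions rather than recommendations, and for Flink 1.10 you also have to swap the Log4j jars in ${FLINK_HOME}/lib for logback-core, logback-classic and log4j-over-slf4j before Logback takes effect:

    <configuration>
        <!-- size-based rolling file appender, analogous to the RollingFileAppender above -->
        <appender name="R" class="ch.qos.logback.core.rolling.RollingFileAppender">
            <file>${log.file}</file>
            <rollingPolicy class="ch.qos.logback.core.rolling.FixedWindowRollingPolicy">
                <fileNamePattern>${log.file}.%i</fileNamePattern>
                <minIndex>1</minIndex>
                <maxIndex>10</maxIndex>
            </rollingPolicy>
            <triggeringPolicy class="ch.qos.logback.core.rolling.SizeBasedTriggeringPolicy">
                <maxFileSize>256MB</maxFileSize>
            </triggeringPolicy>
            <encoder>
                <pattern>%d{yyyy-MM-dd HH:mm:ss,SSS} %-5level %-60logger{60} - %msg%n</pattern>
            </encoder>
        </appender>

        <root level="INFO">
            <appender-ref ref="R"/>
        </root>
    </configuration>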

    References

    Flink Logging
