Flume Learning Notes

Author: 9c0ddf06559c | Published 2018-01-01 21:42

Official documentation

Core components

  1. Source: collects the data

  2. Channel: aggregates and buffers the data

  3. Sink: writes the data out
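
These three components are declared and wired together by name in an agent's properties file. A generic skeleton of that wiring (angle brackets are placeholders; example.conf under Requirement 1 below fills in concrete names):

<agent>.sources = <source-name>
<agent>.channels = <channel-name>
<agent>.sinks = <sink-name>

# A source can write to one or more channels; a sink reads from exactly one channel
<agent>.sources.<source-name>.channels = <channel-name>
<agent>.sinks.<sink-name>.channel = <channel-name>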

Flume installation prerequisites

  • Java Runtime Environment - Java 1.8 or later
  • Memory - Sufficient memory for configurations used by sources, channels or sinks
  • Disk Space - Sufficient disk space for configurations used by channels or sinks
  • Directory Permissions - Read/Write permissions for directories used by agent

Installation

  1. Install the JDK
  2. Download Flume and extract it to the user's home directory
  3. Configure the environment variables:
    export FLUME_HOME="/Users/gaowenfeng/Documents/bigdata/flume"
    export PATH=$FLUME_HOME/bin:$PATH

  4. Source the profile so the variables take effect (see the sketch after this list)
  5. Configure flume-env.sh:
    export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_91.jdk/Contents/Home
  6. Verify the installation: $FLUME_HOME/bin/flume-ng version
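
Steps 4 and 6 as a quick sketch, assuming the exports above were added to ~/.bash_profile (the exact profile file is an assumption):

# Reload the shell profile so FLUME_HOME and PATH take effect
source ~/.bash_profile
# Check that Flume is on the PATH and runnable
$FLUME_HOME/bin/flume-ng version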

Requirement 1

The key to using Flume is writing the configuration file:

  1. Configure the Source
  2. Configure the Channel
  3. Configure the Sink
  4. Wire the three components together
# example.conf: A single-node Flume configuration

# a1 is the agent name
# r1 is the source name
# k1 is the sink name
# c1 is the channel name

# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# Describe the sink
a1.sinks.k1.type = logger

# Use a channel which buffers events in memory
a1.channels.c1.type = memory


# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
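
The memory channel above runs with its default settings. If those turn out to be too small, the standard memory-channel properties can be tuned; the values below are illustrative, not part of the original example:

a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100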

Start the agent

flume-ng agent \
--name a1 \
--conf $FLUME_HOME/conf \
--conf-file $FLUME_HOME/conf/example.conf \
-Dflume.root.logger=INFO,console

Test with telnet

telnet localhost 44444
Event: { headers:{} body: 68 65 6C 6C 6F 0D                               hello. }
An Event is the basic unit of data transfer in Flume.
Event = optional headers + byte array (the body above, 68 65 6C 6C 6F 0D, is "hello" followed by a carriage return)

Requirement 2

Agent components: exec source + memory channel + logger sink

exec-memory-logger.conf


# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /Users/gaowenfeng/data/data.log
a1.sources.r1.shell = /bin/sh -c

# Describe the sink
a1.sinks.k1.type = logger

# Use a channel which buffers events in memory
a1.channels.c1.type = memory


# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1

Start the agent

flume-ng agent \
--name a1 \
--conf $FLUME_HOME/conf \
--conf-file $FLUME_HOME/conf/exec-memory-logger.conf \
-Dflume.root.logger=INFO,console
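
To check the setup, append a line of arbitrary text to the file the exec source is tailing (the path comes from the config above); the logger sink should print a matching event to the console:

# Append an arbitrary test line to the tailed file
echo "hello flume" >> /Users/gaowenfeng/data/data.log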

Requirement 3

Components:

    exec source + memory channel + avro sink
    avro source + memory channel + logger sink

exec-memory-avro.conf


# Name the components on this agent
exec-memory-avro.sources = exec-source
exec-memory-avro.sinks = avro-sink
exec-memory-avro.channels = memory-channel

# Describe/configure the source
exec-memory-avro.sources.exec-source.type = exec
exec-memory-avro.sources.exec-source.command = tail -F /Users/gaowenfeng/data/data.log
exec-memory-avro.sources.exec-source.shell = /bin/sh -c

# Describe the sink
exec-memory-avro.sinks.avro-sink.type = avro
exec-memory-avro.sinks.avro-sink.hostname = localhost
exec-memory-avro.sinks.avro-sink.port = 44444

# Use a channel which buffers events in memory
exec-memory-avro.channels.memory-channel.type = memory


# Bind the source and sink to the channel
exec-memory-avro.sources.exec-source.channels = memory-channel
exec-memory-avro.sinks.avro-sink.channel = memory-channel

avro-memory-logger.conf


# Name the components on this agent
avro-memory-logger.sources = avro-source
avro-memory-logger.sinks = logger-sink
avro-memory-logger.channels = memory-channel

# Describe/configure the source
avro-memory-logger.sources.avro-source.type = avro
avro-memory-logger.sources.avro-source.bind = localhost
avro-memory-logger.sources.avro-source.port = 44444

# Describe the sink
avro-memory-logger.sinks.logger-sink.type = logger

# Use a channel which buffers events in memory
avro-memory-logger.channels.memory-channel.type = memory


# Bind the source and sink to the channel
avro-memory-logger.sources.avro-source.channels = memory-channel
avro-memory-logger.sinks.logger-sink.channel = memory-channel

Start the avro-memory-logger agent first, so its avro source is already listening when the avro sink of the other agent connects

flume-ng agent \
--name avro-memory-logger \
--conf $FLUME_HOME/conf \
--conf-file $FLUME_HOME/conf/avro-memory-logger.conf \
-Dflume.root.logger=INFO,console

Then start the exec-memory-avro agent

flume-ng agent \
--name exec-memory-avro \
--conf $FLUME_HOME/conf \
--conf-file $FLUME_HOME/conf/exec-memory-avro.conf \
-Dflume.root.logger=INFO,console
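
To test the two-agent chain, append a line to the tailed file again; it passes through the exec source, memory channel and avro sink of the first agent, over the network to the avro source of the second, and should appear in the avro-memory-logger console:

# Append an arbitrary test line; watch the avro-memory-logger console for the event
echo "hello from the chained agents" >> /Users/gaowenfeng/data/data.log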

Log collection process

  1. An agent on machine A monitors a file: when users access the main site, user-behavior logs are appended to access.log
  2. The avro sink sends each newly produced log line to the hostname and port configured on the matching avro source
  3. The agent holding that avro source writes the logs to its destination, here the console (in a real pipeline this would typically be Kafka)
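
The same flow as a one-line sketch:

access.log -> exec source -> memory channel -> avro sink ==(network)==> avro source -> memory channel -> logger sink (console here; Kafka in a real pipeline)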
