Flume: Reading Log File Data and Writing It to Kafka

Author: VIAE | Published 2019-02-02 14:24

Go to the Flume configuration directory D:\apache-flume-1.9.0-bin\conf and create a configuration file named kafka.conf:

# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = spooldir
a1.sources.r1.spoolDir = /test/flumeSpool
a1.sources.r1.fileHeader = true

# Describe the sink
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.kafka.topic = test
a1.sinks.k1.kafka.bootstrap.servers = localhost:9092
a1.sinks.k1.kafka.flumeBatchSize = 20
a1.sinks.k1.kafka.producer.acks = 1
a1.sinks.k1.kafka.producer.linger.ms = 1
a1.sinks.k1.kafka.producer.compression.type = snappy

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1

Save the configuration file, then open a command window in D:\apache-flume-1.9.0-bin\conf and run:

flume-ng agent --conf ../conf --conf-file kafka.conf --name a1
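
flume-ng is the Unix shell launcher; since this walkthrough runs on Windows, note that the Flume binary distribution also ships flume-ng.cmd and flume-ng.ps1 under bin. Assuming the .cmd wrapper accepts the same arguments as the shell script, the equivalent call from the conf directory is:

..\bin\flume-ng.cmd agent --conf ../conf --conf-file kafka.conf --name a1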

Next, start ZooKeeper and then the Kafka broker, both from the Kafka install directory D:\kafka_2.11-1.0.0. If no external ZooKeeper is running, the ZooKeeper bundled with Kafka can be started with:
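
.\bin\windows\zookeeper-server-start.bat .\config\zookeeper.properties

With ZooKeeper up, start the Kafka broker: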

.\bin\windows\kafka-server-start.bat .\config\server.properties
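
Nothing above creates the test topic. With the broker default auto.create.topics.enable=true, the first write creates it automatically, but it can also be created explicitly; for Kafka 1.0.0 the topics tool still addresses ZooKeeper:

.\bin\windows\kafka-topics.bat --create --zookeeper localhost:2181 --replication-factor 1 --partitions 1 --topic test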

Start a Kafka console consumer:

.\bin\windows\kafka-console-consumer.bat --bootstrap-server localhost:9092 --topic test --from-beginning

The --topic here must match the topic configured on the Kafka sink (a1.sinks.k1.kafka.topic) in kafka.conf under D:\apache-flume-1.9.0-bin\conf; both are test in this example.
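
To test the pipeline end to end, drop a log file into the spool directory and watch the consumer window: every line of the file should appear there, and Flume marks the ingested file by renaming it with a .COMPLETED suffix. The path below is an assumption: with Flume running from the D: drive, the configured /test/flumeSpool resolves to D:\test\flumeSpool (create the directory first if it does not exist).

echo hello flume > D:\test\flumeSpool\test1.log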

To see where Kafka persists the data on disk, open server.properties under D:\kafka_2.11-1.0.0\config and find log.dirs; its value is the directory Kafka writes its log segments (the stored messages) to.
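
In the stock server.properties this entry carries the Kafka default shown below; on a Windows install it is usually repointed at a local directory:

log.dirs=/tmp/kafka-logs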
