A quick trick: pulling data into Kafka with Flume

Create a consumer and start it consuming (here it reads the test02 topic; change the topic name or create your own as needed):
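For reference, this step comes down to the stock Kafka command-line tools; roughly like the following (the broker address matches the Flume config at the bottom of the post, and older Kafka releases take --zookeeper <host:2181> instead of --bootstrap-server):

# create the topic first if it does not exist yet
bin/kafka-topics.sh --create --bootstrap-server 192.168.57.138:9092 --replication-factor 1 --partitions 1 --topic test02

# start a console consumer that prints whatever Flume pushes into test02
bin/kafka-console-consumer.sh --bootstrap-server 192.168.57.138:9092 --topic test02 --from-beginning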
Turn on the Flume "pipe" (it tails temData.tsv):
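Roughly, assuming you run it from the Flume home directory and name the agent a1 as in the config below:

bin/flume-ng agent --conf conf --conf-file job/flume-kafka.conf --name a1 -Dflume.root.logger=INFO,console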
I use a Flume configuration file for this (job is a folder I created under the Flume directory, and it holds flume-kafka.conf; the full file is shown at the bottom of the post).
Write data into temData.tsv (I put together a small jar that keeps appending records to temData.tsv):
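The jar itself is not included in the post; a minimal sketch of such a generator could look like this (the class name TemDataWriter and the timestamp + temperature record format are placeholders for illustration):

import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;
import java.util.Random;

public class TemDataWriter {
    public static void main(String[] args) throws IOException, InterruptedException {
        // file that the Flume exec source is tailing
        String path = args.length > 0 ? args[0] : "/home/hadoop/data/temData.tsv";
        Random rnd = new Random();
        while (true) {
            // append one tab-separated record per second so tail -F keeps picking up new lines
            try (PrintWriter out = new PrintWriter(new FileWriter(path, true))) {
                out.println(System.currentTimeMillis() + "\t" + (20 + rnd.nextInt(15)));
            }
            Thread.sleep(1000);
        }
    }
}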

flume-kafka.conf:
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /home/hadoop/data/temData.tsv

# Describe the sink
#a1.sinks.k1.type = logger
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.topic = test02
a1.sinks.k1.brokerList = 192.168.57.138:9092
a1.sinks.k1.requiredAcks = 1
a1.sinks.k1.batchSize = 20
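# Note: topic/brokerList/requiredAcks/batchSize are the older (Flume 1.6-style) property names;
# on Flume 1.7+ the equivalents are kafka.topic, kafka.bootstrap.servers,
# kafka.producer.acks and flumeBatchSize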

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1