Spark Streaming Real-Time Stream Processing Notes (3): Log Collection with Flume
阿新 · Published 2018-12-06
1 Introduction to Flume
1.1 Design goals
- Reliability
- Scalability
- Manageability
1.2 Comparable products
- Flume: Cloudera/Apache, Java
- Scribe: Facebook, C/C++ (no longer maintained)
- Chukwa: Yahoo/Apache, Java
- Fluentd: Ruby
- Logstash: part of the ELK stack (Elasticsearch, Logstash, Kibana)
1.3 History of Flume
- Cloudera, 0.9.2, Flume OG
- Apache, FLUME-728, Flume NG
1.4 Event
An Event is the basic unit of data transferred in Flume:
Event = optional headers + byte-array body
2 Flume architecture and core components
- Source: collects data from an external origin
- Channel: buffers events between the source and the sink
- Sink: delivers events to the destination
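The three components are wired together in an agent's properties file. A minimal sketch of the naming convention (the agent name a1 and component names r1/c1/k1 are arbitrary):

```properties
# Declare the components this agent owns
a1.sources = r1
a1.channels = c1
a1.sinks = k1
# A source may feed several channels, so the key is plural
a1.sources.r1.channels = c1
# A sink drains exactly one channel, so the key is singular
a1.sinks.k1.channel = c1
```

Every concrete agent definition later in these notes follows this same pattern, only with real source/sink types filled in.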
3 Flume environment setup
3.1 Configure the JDK
3.2 Download Flume
http://archive-primary.cloudera.com/cdh5/cdh/5/
[hadoop@node1 ~]$ ll
total 66760
drwxrwxr-x. 15 hadoop hadoop 4096 Nov 1 08:52 apps
drwxrwxr-x. 4 hadoop hadoop 30 Oct 25 21:59 elasticsearchData
-rw-r--r--. 1 hadoop hadoop 67321333 Dec 3 19:37 flume-ng-1.6.0-cdh5.7.0.tar.gz
drwxrwxr-x. 4 hadoop hadoop 28 Sep 14 19:02 hbase
drwxrwxr-x. 4 hadoop hadoop 32 Sep 14 14:44 hdfsdir
drwxrwxrwx. 3 hadoop hadoop 26 Oct 30 16:53 hdp2.6-cdh5.7-data
drwxrwxrwx. 3 hadoop hadoop 18 Oct 24 21:45 kafkaData
drwxrwxr-x. 5 hadoop hadoop 133 Oct 23 14:40 metastore_db
-rw-r--r--. 1 hadoop hadoop 999635 Aug 29 2017 mysql-connector-java-5.1.44-bin.jar
drwxr-xr-x. 30 hadoop hadoop 4096 Dec 2 18:40 spark-2.2.0
drwxrwxr-x. 3 hadoop hadoop 63 Oct 24 21:21 zookeeperData
-rw-rw-r--. 1 hadoop hadoop 26108 Oct 25 17:54 zookeeper.out
[hadoop@node1 ~]$ pwd
/home/hadoop
[hadoop@node1 ~]$ tar -zxvf flume-ng-1.6.0-cdh5.7.0.tar.gz -C /home/hadoop/apps
3.3 Add Flume to the environment variables
vim /etc/profile
export FLUME_HOME=/home/hadoop/apps/apache-flume-1.6.0-cdh5.7.0-bin
export PATH=$PATH:$FLUME_HOME/bin
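The two export lines can be sanity-checked in a shell; a minimal sketch (the directory matches the tarball unpacked in the previous step):

```shell
# Add Flume to the environment (same lines as in /etc/profile)
export FLUME_HOME=/home/hadoop/apps/apache-flume-1.6.0-cdh5.7.0-bin
export PATH=$PATH:$FLUME_HOME/bin
# The flume-ng launcher should now resolve from this directory
echo "$FLUME_HOME/bin"
```

After `source /etc/profile`, running `flume-ng version` should print the Flume version, confirming the PATH entry works.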
3.4 Configure flume-env.sh
[hadoop@node1 ~]$ cd $FLUME_HOME
[hadoop@node1 apache-flume-1.6.0-cdh5.7.0-bin]$ cd conf
[hadoop@node1 conf]$ ll
total 16
-rw-r--r--. 1 hadoop hadoop 1661 Mar 24 2016 flume-conf.properties.template
-rw-r--r--. 1 hadoop hadoop 1110 Mar 24 2016 flume-env.ps1.template
-rw-r--r--. 1 hadoop hadoop 1214 Mar 24 2016 flume-env.sh.template
-rw-r--r--. 1 hadoop hadoop 3107 Mar 24 2016 log4j.properties
[hadoop@node1 conf]$ cp flume-env.sh.template flume-env.sh
[hadoop@node1 conf]$ ll
total 20
-rw-r--r--. 1 hadoop hadoop 1661 Mar 24 2016 flume-conf.properties.template
-rw-r--r--. 1 hadoop hadoop 1110 Mar 24 2016 flume-env.ps1.template
-rw-r--r--. 1 hadoop hadoop 1214 Dec 3 19:48 flume-env.sh
-rw-r--r--. 1 hadoop hadoop 1214 Mar 24 2016 flume-env.sh.template
-rw-r--r--. 1 hadoop hadoop 3107 Mar 24 2016 log4j.properties
[hadoop@node1 conf]$
Set JAVA_HOME in flume-env.sh.
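Setting JAVA_HOME amounts to one export line in flume-env.sh. A sketch, with a temp file standing in for $FLUME_HOME/conf/flume-env.sh; the JDK path shown is an assumption, point it at your own installation:

```shell
# Stand-in for $FLUME_HOME/conf/flume-env.sh
FLUME_ENV=$(mktemp)
# Hypothetical JDK location -- replace with your actual JAVA_HOME
echo 'export JAVA_HOME=/usr/java/jdk1.8.0_144' >> "$FLUME_ENV"
# Confirm the line landed in the file
grep 'JAVA_HOME' "$FLUME_ENV"
```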
4 Flume test cases
4.1 Collect data from a network port and print it to the console
- Write a configuration file
See https://flume.apache.org/FlumeUserGuide.html
[hadoop@node1 conf]$ pwd
/home/hadoop/apps/apache-flume-1.6.0-cdh5.7.0-bin/conf
[hadoop@node1 conf]$ mkdir myconf
[hadoop@node1 conf]$ cd myconf/
[hadoop@node1 myconf]$ vim logger.conf
# example.conf: A single-node Flume configuration
# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1
# Describe/configure the source
a1.sources.r1.type = netcat
a1.sources.r1.bind = node1
a1.sources.r1.port = 44444
# Describe the sink
a1.sinks.k1.type = logger
# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100
# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
- Start the agent
flume-ng agent --name a1 --conf $FLUME_HOME/conf --conf-file $FLUME_HOME/conf/myconf/logger.conf -Dflume.root.logger=INFO,console
- Test: connect with `telnet node1 44444` and type a few lines; each line should appear on the agent console as a logged Event.
4.2 Monitor a file and print newly appended data to the console
- Agent components
exec source + memory channel + logger sink
- Create the configuration file
[hadoop@node1 myconf]$ vim exec-memory-logger.conf
# exec-memory-logger.conf: tail a file and log new lines to the console
# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1
# Describe/configure the source
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /home/hadoop/tempdata/data.log
a1.sources.r1.shell = /bin/sh -c
# Describe the sink
a1.sinks.k1.type = logger
# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100
# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
- Start the agent
flume-ng agent --name a1 --conf $FLUME_HOME/conf --conf-file $FLUME_HOME/conf/myconf/exec-memory-logger.conf -Dflume.root.logger=INFO,console
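To test, append lines to the monitored file: the exec source runs `tail -F`, so every appended line becomes an event on the console. A sketch of the mechanism, with a temp file standing in for /home/hadoop/tempdata/data.log:

```shell
# Stand-in for the monitored log file
LOG=$(mktemp)
# Each appended line is what tail -F would hand to the exec source
echo "hello flume" >> "$LOG"
echo "second event" >> "$LOG"
# Show the lines that would have been emitted as events
tail -n 2 "$LOG"
```

Note that `tail -F` (capital F) follows the file by name, so the source keeps working even if the log file is rotated and recreated.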
4.3 Collect logs from server A to server B in real time
4.3.1 Component selection
- First agent (on server A):
exec source + memory channel + avro sink
- Second agent (on server B):
avro source + memory channel + logger sink
4.3.2 Configuration files
exec-memory-avro.conf
# Name the components on this agent
exec-memory-avro.sources = exec-source
exec-memory-avro.sinks = avro-sink
exec-memory-avro.channels = memory-channel
# Describe/configure the source
exec-memory-avro.sources.exec-source.type = exec
exec-memory-avro.sources.exec-source.command = tail -F /home/hadoop/tempdata/data.log
exec-memory-avro.sources.exec-source.shell = /bin/sh -c
# Describe the sink
exec-memory-avro.sinks.avro-sink.type = avro
exec-memory-avro.sinks.avro-sink.hostname = node1
exec-memory-avro.sinks.avro-sink.port = 44444
# Use a channel which buffers events in memory
exec-memory-avro.channels.memory-channel.type = memory
# Bind the source and sink to the channel
exec-memory-avro.sources.exec-source.channels = memory-channel
exec-memory-avro.sinks.avro-sink.channel = memory-channel
avro-memory-logger.conf
# Name the components on this agent
avro-memory-logger.sources = avro-source
avro-memory-logger.sinks = logger-sink
avro-memory-logger.channels = memory-channel
# Describe/configure the source
avro-memory-logger.sources.avro-source.type = avro
avro-memory-logger.sources.avro-source.bind = node1
avro-memory-logger.sources.avro-source.port = 44444
# Describe the sink
avro-memory-logger.sinks.logger-sink.type = logger
# Use a channel which buffers events in memory
avro-memory-logger.channels.memory-channel.type = memory
# Bind the source and sink to the channel
avro-memory-logger.sources.avro-source.channels = memory-channel
avro-memory-logger.sinks.logger-sink.channel = memory-channel
4.3.3 Start the agents
Start the downstream agent first, so its avro source is listening before the upstream avro sink connects:
flume-ng agent --name avro-memory-logger --conf $FLUME_HOME/conf --conf-file $FLUME_HOME/conf/myconf/avro-memory-logger.conf -Dflume.root.logger=INFO,console
Then start the upstream agent:
flume-ng agent --name exec-memory-avro --conf $FLUME_HOME/conf --conf-file $FLUME_HOME/conf/myconf/exec-memory-avro.conf -Dflume.root.logger=INFO,console