
102 - Integrating log4j with Flume

2015-10-08 22:50
1. Install Flume
1.1 Version: 1.5.2

1.2 Edit the configuration file flume-conf.properties
[hadoop@mycluster conf]$ vi flume-conf.properties

agent1.channels = ch1
agent1.sources = avro-source1
agent1.sinks = log-sink1

# Define the channel
agent1.channels.ch1.type = memory

# Define the source
agent1.sources.avro-source1.channels = ch1
agent1.sources.avro-source1.type = avro
agent1.sources.avro-source1.bind = 0.0.0.0
agent1.sources.avro-source1.port = 41414

# Define the sink (the console output in section 2.3 shows events arriving
# at a logger sink, so log-sink1 is configured here as type logger)
agent1.sinks.log-sink1.channel = ch1
agent1.sinks.log-sink1.type = logger
Note: the integration uses Flume's avro source, which listens on port 41414 for events sent by the log4j appender.

1.3 Start Flume
bin/flume-ng agent --conf conf --conf-file conf/flume-conf.properties --name agent1 -Dflume.root.logger=INFO,console

2. Verify that Flume receives the log4j output
2.1 Configure log4j
log4j.rootLogger=INFO,flume

log4j.appender.flume = org.apache.flume.clients.log4jappender.Log4jAppender
log4j.appender.flume.Hostname = 192.168.2.20
log4j.appender.flume.Port = 41414
log4j.appender.flume.UnsafeMode = true
log4j.appender.flume.layout=org.apache.log4j.PatternLayout
log4j.appender.flume.layout.ConversionPattern= %m%n


Note: the project must declare the Maven dependency for the Flume log4j appender:

<dependency>
    <groupId>org.apache.flume.flume-ng-clients</groupId>
    <artifactId>flume-ng-log4jappender</artifactId>
    <version>1.5.2</version>
</dependency>


Note: flume-ng collects the logs that log4j produces purely through log4j configuration; no application code changes are required. In this production setup, the service producing the logs and the Flume node run on the same machine.

2.2 Write a log4j test class to verify that log4j and Flume are integrated correctly

package jfyun.log;

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;

public class LogProducer {

    // Emits one log line per second; the log4j Flume appender forwards
    // each line to the avro source configured above.
    public static void main(String[] args) {
        Log logger = LogFactory.getLog(LogProducer.class);
        while (true) {
            try {
                long start = System.currentTimeMillis();
                Thread.sleep(1000);
                long elapsed = System.currentTimeMillis() - start;
                // Message text kept as in the original post ("provincial authentication
                // interface ... response time ... current time"); the hex body in the
                // Flume console output below is the UTF-8 encoding of this string.
                logger.info("省公司鉴权接口:" + "http://bj.auth.com" + ",响应时间:" + elapsed
                        + ",当前时间:" + System.currentTimeMillis());
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }
    }
}
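
A downstream consumer of these events will often need to pull the response time back out of the message. Below is a minimal sketch of such a parser; the field layout matches the message built in LogProducer above, but the class and method names are hypothetical, not part of the original setup.

```java
// Hypothetical parser for messages shaped like:
//   省公司鉴权接口:http://bj.auth.com,响应时间:1001,当前时间:1443523492933
public class AuthLogParser {

    // Returns the response time in milliseconds, or -1 if the line does not match.
    public static long parseResponseTime(String line) {
        // The URL contains no commas, so a plain split on "," is safe here.
        String[] fields = line.split(",");
        for (String field : fields) {
            if (field.startsWith("响应时间:")) {
                return Long.parseLong(field.substring("响应时间:".length()));
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        String line = "省公司鉴权接口:http://bj.auth.com,响应时间:1001,当前时间:1443523492933";
        System.out.println(parseResponseTime(line));  // prints 1001
    }
}
```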


2.3 Check the Flume console. Continuous output like the following confirms that the integration works:
2015-09-29 03:44:52,572 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.LoggerSink.process(LoggerSink.java:70)] Event: { headers:{flume.client.log4j.log.level=20000,
flume.client.log4j.message.encoding=UTF8, flume.client.log4j.logger.name=jfyun.log.LogProducer, flume.client.log4j.timestamp=1443523492933} body: E7 9C 81 E5 85 AC E5 8F B8 E9 89 B4 E6 9D 83 E6 ................ }

2015-09-29 03:44:53,586 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.LoggerSink.process(LoggerSink.java:70)] Event: { headers:{flume.client.log4j.log.level=20000, flume.client.log4j.message.encoding=UTF8, flume.client.log4j.logger.name=jfyun.log.LogProducer,
flume.client.log4j.timestamp=1443523493947} body: E7 9C 81 E5 85 AC E5 8F B8 E9 89 B4 E6 9D 83 E6 ................ }

2015-09-29 03:44:54,603 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.LoggerSink.process(LoggerSink.java:70)] Event: { headers:{flume.client.log4j.log.level=20000, flume.client.log4j.message.encoding=UTF8, flume.client.log4j.logger.name=jfyun.log.LogProducer,
flume.client.log4j.timestamp=1443523494961} body: E7 9C 81 E5 85 AC E5 8F B8 E9 89 B4 E6 9D 83 E6 ................ }

2015-09-29 03:44:55,617 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.LoggerSink.process(LoggerSink.java:70)] Event: { headers:{flume.client.log4j.log.level=20000, flume.client.log4j.message.encoding=UTF8, flume.client.log4j.logger.name=jfyun.log.LogProducer,
flume.client.log4j.timestamp=1443523495975} body: E7 9C 81 E5 85 AC E5 8F B8 E9 89 B4 E6 9D 83 E6 ................ }

The body field of each event holds the collected log data: the raw UTF-8 bytes of the log message (the console truncates the body after 16 bytes).
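
Those hex bytes can be decoded back to text to confirm that the message survived transit intact. A quick sketch using only the JDK; the byte values are copied from the first event above, and since the console cuts the body off mid-character, only the complete leading UTF-8 sequences are decoded here.

```java
import java.nio.charset.StandardCharsets;

public class BodyDecoder {

    // Decodes a UTF-8 event body back into a Java string.
    public static String decode(byte[] body) {
        return new String(body, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        // First 15 bytes of the body shown above; the 16th byte (E6) starts a
        // character that the console output truncated, so it is omitted.
        byte[] body = {
            (byte) 0xE7, (byte) 0x9C, (byte) 0x81,  // 省
            (byte) 0xE5, (byte) 0x85, (byte) 0xAC,  // 公
            (byte) 0xE5, (byte) 0x8F, (byte) 0xB8,  // 司
            (byte) 0xE9, (byte) 0x89, (byte) 0xB4,  // 鉴
            (byte) 0xE6, (byte) 0x9D, (byte) 0x83   // 权
        };
        System.out.println(decode(body));  // prints 省公司鉴权
    }
}
```

The decoded prefix matches the start of the message logged by LogProducer, which is exactly what we expect.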