Notes on the logstash-output-mongodb plugin
2016-09-13 17:17
Data is read from Kafka and written to MongoDB; the setup is suited to geo_point spatial queries. The configuration file is as follows:
input {
  kafka {
    type => "test"
    auto_offset_reset => "smallest"
    group_id => "m2"
    topic_id => "db100"
    zk_connect => "192.168.199.6:2181,192.168.199.7:2181,192.168.199.8:2181"
  }
}
filter {
  mutate {
    split => { "message" => "," }
    add_field => {
      "id" => "%{message[1]}"
      "SYZTDM_s" => "%{message[55]}"
      "lat" => "%{message[6]}"
      "lon" => "%{message[7]}"
      "loc" => "%{message[6]}"
    }
    remove_field => [ "message", "path", "host", "type" ]
  }
  mutate {
    convert => {
      "lat" => "float"
      "lon" => "float"
      "loc" => "float"
    }
  }
  mutate {
    merge => { "loc" => "lon" }
  }
}
output {
  mongodb {
    collection => "base"
    database => "fragment"
    uri => "mongodb://192.168.199.7:27017"
  }
}
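The filter chain above splits the comma-separated message, copies fields out by index, casts them to float, and merges lon into loc so that loc becomes a two-element array. A minimal Python sketch of the same transformation (the sample record below is made up purely to exercise the field indexes):

```python
def transform(message: str) -> dict:
    """Mimic the mutate filters: split, add_field, convert, merge."""
    parts = message.split(",")          # split => { "message" => "," }
    event = {
        "id": parts[1],
        "SYZTDM_s": parts[55],
        "lat": float(parts[6]),         # convert => "float"
        "lon": float(parts[7]),
    }
    # merge => { "loc" => "lon" }: loc starts as the lat value and lon is
    # appended, yielding the [lat, lon] array that is stored in MongoDB
    event["loc"] = [event["lat"], event["lon"]]
    return event

# A made-up 60-field record just to exercise the indexes above
sample = ",".join(str(i) for i in range(60))
print(transform(sample))
```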
Notes:
1. Logstash needs the MongoDB output plugin installed; it is not bundled by default (bin/logstash-plugin install logstash-output-mongodb).
2. Documents are written one at a time with insert; there is no bulk write.
3. geo_point queries operate on the array-typed loc field.
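Note 3 means the stored loc array can back a MongoDB 2d index. A hedged Python sketch of the index spec and query documents such a query would use (the coordinate values are made up, and the pymongo calls are shown only in comments; note that the config stores loc as [lat, lon], so query points must use the same ordering):

```python
# Index spec for a legacy "2d" geospatial index on the loc array;
# with pymongo this would be passed to create_index on the "base"
# collection of the "fragment" database named in the output section.
index_spec = [("loc", "2d")]
# e.g. client["fragment"]["base"].create_index(index_spec)

# $near query against the stored [lat, lon] pairs; the point and the
# $maxDistance radius below are made-up illustration values.
near_query = {"loc": {"$near": [30.5, 114.3], "$maxDistance": 0.1}}
# e.g. client["fragment"]["base"].find(near_query)
print(near_query)
```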