Logstash: send only the JSON
I am sending logs through Logstash in the following format:
2017-02-27T13:00:07+01:00 test {"createdAt":"2017-02-27T13:00:07+0100","cluster":"undefined","nodeName":"undefined","nodeIP":"10.11.11.50","clientIP":"10.11.11.72","customerId":1,"identityId":332,"appType":"admin","eventGroup":"education","eventName":"insert","eventData":{"education_insert":{"type":"course","data":{"education_id":2055,"education":{"id":2055,"customer_id":1,"creator_id":332,"type":"course","status":"new","is_featured":false,"enroll_deadline":null,"complete_deadline":null,"count_view":0,"count_like":0,"meta_title":"test Course - progress","meta_description":"test Course - progress","discoverable":"everyone","progress_max":0,"instructor_ids":[332],"tag_ids":[135],"discoverable_group_ids":[],"category_ids":[14],"audits":null,"instructors":null,"creator":null,"lessonGroups":null,"categories":null},"duration":"quick"}}},"scopeType":"education","scopeId":"2055"}
How do I remove the leading 2017-02-27T13:00:07+01:00 timestamp and the test.app.event topic, so that only the JSON is sent?
You need to use grok to extract the JSON part of the message, then use the json filter to turn the extracted JSON into an event. Finally, use mutate to remove any fields you don't want in the final event (e.g. message). You can use a regex pattern so that you capture only the JSON. The regex pattern should live in a patterns file.

The pattern might look like:
REQUIREDDATA {([^}]*)}([^}]*)([^}]*)}}}([^}]*)} <-- this extracts only your json part
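The answer mentions a patterns file but does not show how grok finds it. As a hedged sketch (the ./patterns directory and the field name json are assumptions; patterns_dir is the grok option that points Logstash at custom pattern files):

```
# File: ./patterns/extra   (path is an assumption)
REQUIREDDATA {([^}]*)}([^}]*)([^}]*)}}}([^}]*)}

# Filter block referencing the custom pattern:
filter {
  grok {
    patterns_dir => ["./patterns"]
    match => { "message" => "%{REQUIREDDATA:json}" }
  }
}
```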
Now the message contains only the JSON part of the log line, so it can be pushed to ES through Logstash. The above is just a sample, so adapt it as needed. Hope this helps.

Asker: I used this and it worked for me :) Thanks for the help:
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => "test"
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\t%{GREEDYDATA:topic}\t%{GREEDYDATA:json}" }
  }
  json {
    source => "json"
    remove_field => ["timestamp","topic","json","message","@version","@timestamp","tags"]
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    document_type => "app_stats"
    index => "test"
  }
}
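Outside Logstash, the grok-then-json two-step above can be approximated with a short Python sketch. The regex below is a hand-written stand-in for %{TIMESTAMP_ISO8601} and %{GREEDYDATA}, and the sample payload is abbreviated from the log line in the question:

```python
import json
import re

# Sample log line in the format described above:
# ISO8601 timestamp, a tab, the topic, a tab, then the JSON payload.
line = ('2017-02-27T13:00:07+01:00\ttest\t'
        '{"createdAt":"2017-02-27T13:00:07+0100","clientIP":"10.11.11.72",'
        '"customerId":1,"eventGroup":"education","eventName":"insert"}')

# Rough equivalent of the grok pattern
# %{TIMESTAMP_ISO8601:timestamp}\t%{GREEDYDATA:topic}\t%{GREEDYDATA:json}
# (topic is matched as "anything up to the next tab" here, an assumption)
pattern = re.compile(
    r'^(?P<timestamp>\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}[+-]\d{2}:?\d{2})'
    r'\t(?P<topic>[^\t]*)\t(?P<json>.*)$'
)

fields = pattern.match(line).groupdict()

# The json-filter step: parse the captured "json" field into an event.
event = json.loads(fields['json'])
print(event['eventGroup'])  # -> education
```

Only the parsed event dict would reach Elasticsearch; the timestamp and topic captures correspond to the fields the config above drops with remove_field.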
Comment: Go to the line where you said it sends "the datetime, then test, then the JSON" and drop the first two items :) For a more detailed answer you will probably need to add some detail about what you are doing, because debugging this as-is is nearly impossible :)

Asker: I only need to send the JSON data, but I don't know how to remove the first two items in Logstash with the grok filter. Do I have to use "match"?