
Logstash: send only the JSON


I am sending logs through Logstash:

2017-02-27T13:00:07+01:00    test    {"createdAt":"2017-02-27T13:00:07+0100","cluster":"undefined","nodeName":"undefined","nodeIP":"10.11.11.50","clientIP":"10.11.11.72","customerId":1,"identityId":332,"appType":"admin","eventGroup":"education","eventName":"insert","eventData":{"education_insert":{"type":"course","data":{"education_id":2055,"education":{"id":2055,"customer_id":1,"creator_id":332,"type":"course","status":"new","is_featured":false,"enroll_deadline":null,"complete_deadline":null,"count_view":0,"count_like":0,"meta_title":"test Course - progress","meta_description":"test Course - progress","discoverable":"everyone","progress_max":0,"instructor_ids":[332],"tag_ids":[135],"discoverable_group_ids":[],"category_ids":[14],"audits":null,"instructors":null,"creator":null,"lessonGroups":null,"categories":null},"duration":"quick"}}},"scopeType":"education","scopeId":"2055"}

How do I remove the prefixes
2017-02-27T13:00:07+01:00
test.app.event
so that only the JSON is sent?

You need to use grok to extract the JSON part of the message, then use the json filter to parse the extracted JSON into the event. Finally, use mutate to remove any fields you don't want in the final event (e.g. message).
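
A minimal sketch of that grok → json → mutate chain (the field names ts, topic and payload are illustrative, not from the original question):

filter {
  # Split the line into timestamp, topic and the raw JSON payload
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:ts}\t%{NOTSPACE:topic}\t%{GREEDYDATA:payload}" }
  }
  # Parse the extracted JSON string into top-level event fields
  json {
    source => "payload"
  }
  # Drop the helper fields and the original raw line
  mutate {
    remove_field => ["ts", "topic", "payload", "message"]
  }
}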

You can also use a custom regex pattern so that grok captures only the JSON part. The regex pattern should live in a patterns file.

The pattern might look like this:

REQUIREDDATA {([^}]*)}([^}]*)([^}]*)}}}([^}]*)} <-- this extracts only your json part
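
Assuming the pattern file is saved in a directory such as /etc/logstash/patterns (the path here is an assumption, not from the answer), grok can load and apply the custom pattern like this:

filter {
  grok {
    # patterns_dir points at the directory containing the custom pattern file
    patterns_dir => ["/etc/logstash/patterns"]
    # Capture only the JSON part of the line into a field named json
    match => { "message" => "%{REQUIREDDATA:json}" }
  }
}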
Now the message holds only the JSON part of the log line, so Logstash can push it to ES. The above is just a sample that you can copy and adapt.


Hope that helps.

I used this and it works for me :) Thanks for your help:

input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics => ["test"]
  }
}

filter {
  # Split the line into timestamp, topic and the JSON payload
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\t%{GREEDYDATA:topic}\t%{GREEDYDATA:json}" }
  }
  # Parse the JSON payload, then drop the helper and metadata fields
  json {
    source => "json"
    remove_field => ["timestamp", "topic", "json", "message", "@version", "@timestamp", "tags"]
  }
}

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    document_type => "app_stats"
    index => "test"
  }
}
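
Note that this config removes @version and @timestamp as well, so the document indexed into Elasticsearch contains only the fields parsed from the JSON payload; the event time is then carried by the embedded createdAt field rather than by Logstash's ingestion timestamp.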

Go to the line where, as you said, "the date-time is sent, then test, then the json", and drop the first two items :) For a more detailed answer you may want to add some detail about what you are doing, because debugging this otherwise is next to impossible :)

I just need to send the JSON data, but I don't know how to remove the first two items in Logstash using a filter. In grok, do I have to use "match"?