Logstash not splitting logs according to the config
I am using the ELK stack with Beats (Elasticsearch, Logstash, Kibana, Filebeat) to process my access logs. One day, I found that Kibana had lost a log line. Grepping the filebeat log, I can find the missing entry:
2017/03/01 10:19:20.096711 client.go:184: DBG Publish: {
  "@timestamp": "2017-03-01T10:19:16.327Z",
  "beat": {
    "hostname": "kvm980156.jx.diditaxi.com",
    "name": "kvm980156.jx.diditaxi.com",
    "version": "5.0.0"
  },
  "input_type": "log",
  "message": "2017-03-01 18:19:11.699|10.94.104.169|17714317657896955-151|1|wangziyi|Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36|POST|/api/v1/answer/|com.didi.km.api.controller.api.v1.question.AnswerController#post[2 args]|{\"questionId\":[\"145\"],\"content\":[\"\u003cp\u003e123123123123123\u003c/p\u003e\"]}|200|220",
  "offset": 1723505,
  "source": "/home/km/didi-km-api/logs/km-access.2017-03-01.log",
  "type": "log"
}
The next log line (request 17714317657896955-152), however, was split correctly by Logstash:

{
    "controllerMethod" => "com.didi.km.api.controller.api.v1.question.AnswerController#answersOrderByHot[2 args]",
    "offset" => 1723849,
    "method" => "GET",
    "input_type" => "log",
    "source" => "/home/km/didi-km-api/logs/km-access.2017-03-01.log",
    "message" => "2017-03-01 18:19:11.855|10.94.104.169|17714317657896955-152|1|wangziyi|Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36|GET|/api/v1/answer/145|com.didi.km.api.controller.api.v1.question.AnswerController#answersOrderByHot[2 args]|{\"order\":[\"hot\"],\"pager\":[\"1,100\"]}|200|60",
    "type" => "log",
    "ua" => "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36",
    "uri" => "/api/v1/answer/145",
    "tags" => [
        [0] "beats_input_codec_plain_applied"
    ],
    "uid" => 1,
    "@timestamp" => 2017-03-01T10:19:11.855Z,
    "param" => "{\"order\":[\"hot\"],\"pager\":[\"1,100\"]}",
    "costTime" => 60,
    "requestID" => "17714317657896955-152",
    "host-ip" => "10.94.104.169",
    "@version" => "1",
    "beat" => {
        "hostname" => "kvm980156.jx.diditaxi.com",
        "name" => "kvm980156.jx.diditaxi.com",
        "version" => "5.0.0"
    },
    "host" => "kvm980156.jx.diditaxi.com",
    "time" => "2017-03-01 18:19:11.855",
    "username" => "wangziyi",
    "statusCode" => 200
}
Moreover, grepping the logstash log, I can find the lost line there too:
{
    "@timestamp" => 2017-03-01T10:19:16.327Z,
    "offset" => 1723505,
    "@version" => "1",
    "input_type" => "log",
    "beat" => {
        "hostname" => "kvm980156.jx.diditaxi.com",
        "name" => "kvm980156.jx.diditaxi.com",
        "version" => "5.0.0"
    },
    "host" => "kvm980156.jx.diditaxi.com",
    "source" => "/home/km/didi-km-api/logs/km-access.2017-03-01.log",
    "message" => "2017-03-01 18:19:11.699|10.94.104.169|17714317657896955-151|1|wangziyi|Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_2) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36|POST|/api/v1/answer/|com.didi.km.api.controller.api.v1.question.AnswerController#post[2 args]|{\"questionId\":[\"145\"],\"content\":[\"<p>123123123123123</p>\"]}|200|220",
    "type" => "log",
    "tags" => [
        [0] "beats_input_codec_plain_applied",
        [1] "_grokparsefailure"
    ]
}
Here is my logstash config, which uses grok to split the log:
input {
  beats {
    port => "5043"
  }
}
filter {
  # TIME|HOST-IP|REQUEST-ID|UID|USERNAME|UA|METHOD|URI|CONTROLLER-METHOD|PARAMS-MAP|STATUS-CODE|COST-TIME
  grok {
    match => {
      "message" => "%{TIMESTAMP_ISO8601:time}\|%{IP:host-ip}\|(?<requestID>\d+-\d+)\|%{INT:uid:int}\|%{WORD:username}\|(?<ua>(\w|\/|\.|\s|\(|;|\)|,)+)\|%{WORD:method}\|(?<uri>(\w|\/)+)\|(?<controllerMethod>(\w|\d|\s|\.|#|\[|\])+)\|(?<param>(\w|{|}|\"|\:|\[|\]|\,)+)\|%{NUMBER:statusCode:int}\|%{NUMBER:costTime:int}"
    }
  }
  date {
    match => ["time", "yyyy-MM-dd HH:mm:ss.SSS"]
    target => "@timestamp"
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => [ "10.94.66.193:9200" ]
    index => "km-access-%{+YYYY.MM.dd}"
  }
}
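For reference, the Joda-style format "yyyy-MM-dd HH:mm:ss.SSS" given to the date filter parses the local time field, and Logstash then converts it to UTC for @timestamp. A small sketch of that conversion in Python, assuming the host runs in UTC+8 (CST), which is consistent with time 18:19:11.855 becoming @timestamp 2017-03-01T10:19:11.855Z in the output above:

```python
from datetime import datetime, timedelta, timezone

# Joda "yyyy-MM-dd HH:mm:ss.SSS" corresponds to this strptime format.
local = datetime.strptime("2017-03-01 18:19:11.855", "%Y-%m-%d %H:%M:%S.%f")

# The date filter converts the parsed local time to UTC for @timestamp.
# Assumption: the host is in UTC+8, as the 8-hour offset between the
# "time" field and "@timestamp" in the rubydebug output suggests.
utc = local.replace(tzinfo=timezone(timedelta(hours=8))).astimezone(timezone.utc)
print(utc.isoformat())  # 2017-03-01T10:19:11.855000+00:00
```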
From what I can see, you are only trying to extract the timestamp part of the log line and match on it. If that is the case, what if you made your grok match less complicated:
grok {
  match => {
    "message" => "%{TIMESTAMP_ISO8601:time}%{GREEDYDATA}"
  }
}
date {
  match => ["time", "yyyy-MM-dd HH:mm:ss.SSS"]
  target => "@timestamp"
}
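The simplified pattern does match the line that was lost. As a quick sanity check, a rough Python stand-in for grok's TIMESTAMP_ISO8601 (an approximation — the real grok pattern is more permissive about separators and timezone suffixes) extracts the time field from the failing message:

```python
import re

# Rough stand-in for grok's %{TIMESTAMP_ISO8601} (an approximation; the
# real grok pattern also allows other separators and timezone suffixes).
ts_re = re.compile(r'\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2}\.\d{3}')

# Start of the message that was tagged _grokparsefailure:
line = "2017-03-01 18:19:11.699|10.94.104.169|17714317657896955-151|1|wangziyi|..."

m = ts_re.match(line)
print(m.group(0))  # 2017-03-01 18:19:11.699
```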
The event is tagged with _grokparsefailure, which tells you that your grok filter could not match it.
Thank you very much, I will use that to track it down. Maybe there are malformed entries in my log.