Logstash: What is the grok pattern for this custom log format?

Here is a small portion of my log:

2018-12-06 18:55:20 INFO  epo - myfile.xml is loaded successfully
2018-12-06 18:55:20 INFO  epo - checking that whether the given file name is already present
2018-12-06 18:55:20 INFO  epo - some logging deatils
2018-12-06 18:55:20 INFO  epo - Entry has been added to table.
2018-12-06 18:55:20 INFO  epo - Total number of records processed 0000035
2018-12-06 18:55:20 INFO  epo - some logging deatils
2018-12-07 09:57:59 INFO  epo - myfile.xml is loaded successfully
2018-12-07 09:57:59 INFO  epo - [ElasticSearch] => PIN07122018F00001 request sent successfully.
2018-12-06 18:55:20 INFO  epo - myfile.xml is loaded successfully
2018-12-06 18:55:20 INFO  epo - checking that whether the given file name is already present
2018-12-06 18:55:20 INFO  epo - some logging deatils
2018-12-06 18:55:20 INFO  epo - Entry has been added to table.
2018-12-06 18:55:20 INFO  epo - Total number of records processed 0000035
2018-12-06 18:55:20 INFO  epo - some logging deatils
2018-12-07 09:57:59 INFO  epo - myfile.xml is loaded successfully
2018-12-07 09:57:59 INFO  epo - [ElasticSearch] => PIN07122018F00002 request sent unsuccessfully.
In this log, I want to select only the lines that contain a request ID, such as PIN07122018F00001 and PIN07122018F00002, and send them to Elasticsearch.

For this I am using Logstash, and my grok configuration is:

input {
  . . .
}

filter {
  grok {
    patterns_dir => ["/myServer/mnt/appln/folder1/folder2/logstash/pattern"]
    match => { "message" => '^%{TIMESTAMP_ISO8601:timestamp} INFO  epo - \[ElasticSearch\] => %{REQ_ID:requestid} %{MSG:statusmsg}$' }
  }
}

output{
    . . .
}
where the request-ID and message patterns are defined as:

MSG (A-Za-z0-9 )+
REQ_ID PIN[0-9]{8}[A-Z]{1}[0-9]{5}
But I still cannot match only the desired lines, because this pattern ends up matching every line. Please tell me a pattern that matches this line:

2018-12-07 09:57:59 INFO  epo - [ElasticSearch] => PIN07122018F00001 request sent successfully.


Please help.

The problem is in the MSG pattern. () denotes a capturing group, which tries to match the exact character sequence written inside the parentheses. What you want here is [], which denotes a character class and matches any one of the characters listed in it. Your pattern is also missing the "." that appears at the end of the line.

Your pattern should be defined like this, which will fix the problem:

MSG [A-Za-z0-9 \.]+
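The difference between the two forms can be checked with a plain regex engine, since grok patterns compile down to regular expressions. A minimal sketch in Python, using a sample status message from the log above:

```python
import re

msg = "request sent successfully."

# Original pattern: (A-Za-z0-9 )+ is a capturing GROUP, so it tries to match
# the literal character sequence "A-Za-z0-9 " one or more times -- not a set
# of allowed characters.
group_pattern = re.compile(r"^(A-Za-z0-9 )+$")
print(bool(group_pattern.match(msg)))  # False

# Fixed pattern: [A-Za-z0-9 \.]+ is a character CLASS, matching any letter,
# digit, space, or dot, one or more times.
class_pattern = re.compile(r"^[A-Za-z0-9 \.]+$")
print(bool(class_pattern.match(msg)))  # True
```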

I still cannot select the desired log line. The pattern I am using is:

"message" => "^%{TIMESTAMP_ISO8601:timestamp} INFO epo - \[ElasticSearch\] => PIN[0-9]{8}[A-Z]{1}[0-9]{5} [A-Za-z0-9 \.]+$"
@prakarverma In this pattern: the log has two spaces between INFO and epo, which your pattern is missing. For places where the number of spaces may not always be the same, you can use \s*
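The whitespace point can be demonstrated the same way; a small sketch (the "..." strings here are just stand-ins for the rest of the log line):

```python
import re

two_space = "2018-12-07 09:57:59 INFO  epo - [ElasticSearch] => ..."
one_space = "2018-12-07 09:57:59 INFO epo - [ElasticSearch] => ..."

strict = re.compile(r"INFO  epo")     # exactly two spaces, as in the real log
flexible = re.compile(r"INFO\s*epo")  # any amount of whitespace, including none

print(bool(strict.search(two_space)), bool(strict.search(one_space)))      # True False
print(bool(flexible.search(two_space)), bool(flexible.search(one_space)))  # True True
```

Note that \s* also matches zero whitespace characters; \s+ (one or more) is slightly stricter and would still match both log variants here.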
Thanks! That worked. I used it like this:
MSG [A-Za-z0-9 \.]+