elasticsearch 要在Logstash中解析的多个模式,elasticsearch,logstash,elastic-stack,logstash-grok,elasticsearch,Logstash,Elastic Stack,Logstash Grok" /> elasticsearch 要在Logstash中解析的多个模式,elasticsearch,logstash,elastic-stack,logstash-grok,elasticsearch,Logstash,Elastic Stack,Logstash Grok" />

elasticsearch - Multiple patterns to parse in Logstash


My log file contains multiple patterns, including JSON-formatted logs. I want to parse the multiple patterns with the grok plugin, but it does not seem to work:

filter {
  grok {
    break_on_match => false
    match => [
      "message", "%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadID} - %{GREEDYDATA:Line}",
      "message", "%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadID} - %{IP:Clicnet} - - %{GREEDYDATA:Line}"
    ]
  }
  json   { source => "Line" }
  mutate { remove_field => [ "Line", "ThreadID" ] }
}

It fails completely on the second kind of line, which contains no JSON:

2017-01-20 15:46:16 INFO  RequestLog:60 - 10.252.134.34 - - [20/Jan/2017:15:46:16 +0000] "OPTIONS //127.0.0.0:8080/ HTTP/1.1" 404 237  1

Error parsing json {:source=>"Line", :raw=>["10.252.134.34 - - [20/Jan/2017:15:46:16 +0000] \"OPTIONS //127.0.0.0:8080/ HTTP/1.1\" 404 237  1", "[20/Jan/2017:15:46:16 +0000] \"OPTIONS //127.0.0.0:8080/ HTTP/1.1\" 404 237  1"], :exception=>java.lang.ClassCastException: org.jruby.RubyArray cannot be cast to org.jruby.RubyIO, :level=>:warn}
{
       "message" => "2017-01-20 15:46:16 INFO  RequestLog:60 - 10.252.134.34 - - [20/Jan/2017:15:46:16 +0000] \"OPTIONS //127.0.0.0:8080/ HTTP/1.1\" 404 237  1",
      "@version" => "1",
    "@timestamp" => "2017-03-20T17:19:51.175Z",
          "type" => "stdin",
          "host" => "ef3b82",
       "LogDate" => [
        [0] "2017-01-20 15:46:16",
        [1] "2017-01-20 15:46:16"
    ],
      "loglevel" => [
        [0] "INFO",
        [1] "INFO"
    ],
    "threadName" => [
        [0] " RequestLog",
        [1] " RequestLog"
    ],
       "Clicnet" => "10.252.134.34",
          "tags" => [
        [0] "_jsonparsefailure"
    ]
}

After spending 5 hours on this, I finally found the solution. With the patterns below, both kinds of log lines are parsed successfully:

/opt/logstash/bin/logstash -e 'filter {grok  { match =>{ "message" =>["%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadName} - %{IP:Client} - - %{GREEDYDATA:LogMessage}", "%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadID} - %{GREEDYDATA:Line}"]}}  json {source => "Line"} mutate{remove_field => [ "Line","ThreadID" ]}}'
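The important differences from the original filter are that break_on_match is left at its default of true, so grok stops at the first pattern that matches instead of applying both and turning LogDate, loglevel and threadName into arrays, and that the more specific pattern (the one with %{IP:Client}) is listed first, so the non-JSON request-log line never ends up in the Line field. For readability, here is the same filter laid out as a pipeline config file; this is a sketch, and the if [Line] conditional is an addition that skips the json filter for events where grok did not capture a Line field:

filter {
  grok {
    # break_on_match defaults to true: stop at the first pattern that matches,
    # with the more specific request-log pattern listed first
    match => {
      "message" => [
        "%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadName} - %{IP:Client} - - %{GREEDYDATA:LogMessage}",
        "%{TIMESTAMP_ISO8601:LogDate} %{LOGLEVEL:loglevel} (?<threadName>[^:]+):%{NUMBER:ThreadID} - %{GREEDYDATA:Line}"
      ]
    }
  }
  # Only parse JSON when the second pattern matched and captured a Line field
  if [Line] {
    json   { source => "Line" }
    mutate { remove_field => [ "Line", "ThreadID" ] }
  }
}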