
Elasticsearch / Logstash keeps writing the same logs with backslashes


I have IIS logs in the following format:

172.24.54.12, -, 1/16/2016, 0:00:25, W3SVC1, DWEB420NTV, 172.24.55.45, 0, 62, 284, 200, 0, GET, /keepalive.html, -,
172.24.54.11, -, 1/16/2016, 0:00:29, W3SVC1, DWEB420NTV, 172.24.55.45, 15, 62, 284, 200, 0, GET, /keepalive.html, -,
172.24.54.12, -, 1/16/2016, 0:00:55, W3SVC1, DWEB420NTV, 172.24.55.45, 0, 62, 284, 200, 0, GET, /keepalive.html, -,
172.24.54.11, -, 1/16/2016, 0:00:59, W3SVC1, DWEB420NTV, 172.24.55.45, 0, 62, 284, 200, 0, GET, /keepalive.html, -,
172.24.54.12, -, 1/16/2016, 0:01:25, W3SVC1, DWEB420NTV, 172.24.55.45, 0, 62, 284, 200, 0, GET, /keepalive.html, -,
My Logstash config is as follows:

input {

  file {
    type => "iis"
    path => "C:/logstash-2.1.1/TestDataLatest/*.log"

  }
}

filter {

  if [message] =~ "^#" {
    drop {}
  }

  grok {
     match => ["message", "%{IP:ClientIP}, %{USER:UserName}, %{DATE:RequestDate}, %{TIME:RequestTime}, %{WORD:MSSVC}, %{WORD:ServerName}, %{IP: ServerIP}, %{NUMBER:ProcessingTime}, %{NUMBER:RequestBytes}, %{NUMBER: ResponseBytes}, %{NUMBER: HttpStatusCode}, %{NUMBER: HttpSubStatusCode}, %{WORD:HttpVerb}, %{GREEDYDATA:RequestUri}, %{GREEDYDATA:QueryParam}"]
  }
}

output {

  stdout { codec => rubydebug }
  file {
    path => "C:/logstash-2.1.1/TestDataLatest/output.log"
  }

}
The grok filter seems to work fine at first, but after Logstash finishes reading the logs it starts writing the same log lines over and over with extra backslashes. I cannot see why this happens. The output looks like this:

{"message":"172.24.54.12, -, 1/16/2016, 0:03:55, W3SVC1, DWEB420NTV,
172.24.55.45, 0, 62, 284, 200, 0, GET, /keepalive.html, -,\r","@version":"1","@timestamp":"2016-01-19T20:00:51.803Z","host":"RB102179","path":"C:/logstash-2.1.1/TestDataLatest/u_in160116.log","type":"iis","ClientIP":"172.24.54.12","UserName":"-","RequestDate":"1/16/2016","RequestTime":"0:03:55","MSSVC":"W3SVC1","ServerName":"DWEB420NTV","ProcessingTime":"0","RequestBytes":"62","HttpVerb":"GET","RequestUri":"/keepalive.html","QueryParam":"-,\r"} {"message":"172.24.54.12, -, 1/16/2016, 0:01:25, W3SVC1, DWEB420NTV,
172.24.55.45, 0, 62, 284, 200, 0, GET, /keepalive.html, -,\r","@version":"1","@timestamp":"2016-01-19T20:00:51.798Z","host":"RB102179","path":"C:/logstash-2.1.1/TestDataLatest/u_in160116.log","type":"iis","ClientIP":"172.24.54.12","UserName":"-","RequestDate":"1/16/2016","RequestTime":"0:01:25","MSSVC":"W3SVC1","ServerName":"DWEB420NTV","ProcessingTime":"0","RequestBytes":"62","HttpVerb":"GET","RequestUri":"/keepalive.html","QueryParam":"-,\r"} {"message":"{\"message\":\"172.24.54.11, -, 1/16/2016, 0:00:29, W3SVC1, DWEB420NTV, 172.24.55.45, 15, 62, 284, 200, 0, GET, /keepalive.html,
-,\\r\",\"@version\":\"1\",\"@timestamp\":\"2016-01-19T20:00:51.797Z\",\"host\":\"RB102179\",\"path\":\"C:/logstash-2.1.1/TestDataLatest/u_in160116.log\",\"type\":\"iis\",\"ClientIP\":\"172.24.54.11\",\"UserName\":\"-\",\"RequestDate\":\"1/16/2016\",\"RequestTime\":\"0:00:29\",\"MSSVC\":\"W3SVC1\",\"ServerName\":\"DWEB420NTV\",\"ProcessingTime\":\"15\",\"RequestBytes\":\"62\",\"HttpVerb\":\"GET\",\"RequestUri\":\"/keepalive.html\",\"QueryParam\":\"-,\\r\"}\r","@version":"1","@timestamp":"2016-01-19T20:01:04.871Z","host":"RB102179","path":"C:/logstash-2.1.1/TestDataLatest/output.log","type":"iis","ClientIP":"172.24.54.11","UserName":"-","RequestDate":"1/16/2016","RequestTime":"0:00:29","MSSVC":"W3SVC1","ServerName":"DWEB420NTV","ProcessingTime":"15","RequestBytes":"62","HttpVerb":"GET","RequestUri":"/keepalive.html","QueryParam":"-,\\r\",\"@version\":\"1\",\"@timestamp\":\"2016-01-19T20:00:51.797Z\",\"host\":\"RB102179\",\"path\":\"C:/logstash-2.1.1/TestDataLatest/u_in160116.log\",\"type\":\"iis\",\"ClientIP\":\"172.24.54.11\",\"UserName\":\"-\",\"RequestDate\":\"1/16/2016\",\"RequestTime\":\"0:00:29\",\"MSSVC\":\"W3SVC1\",\"ServerName\":\"DWEB420NTV\",\"ProcessingTime\":\"15\",\"RequestBytes\":\"62\",\"HttpVerb\":\"GET\",\"RequestUri\":\"/keepalive.html\",\"QueryParam\":\"-,\\r\"}\r"}

As you can see, it writes the events cleanly at first, then keeps writing the same logs again with extra backslashes. I am using Logstash 2.1.1.

I just realized it was something silly: my input and output logs were in the same directory. Since I read the log files with a wildcard, Logstash kept picking up its own output log and processing the modified file over and over again.
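One way to break this feedback loop (a sketch; the `exclude` option and the separate output directory are suggestions, not part of the original post) is to exclude the output file from the input glob, or to write it outside the watched directory:

```
input {
  file {
    type => "iis"
    path => "C:/logstash-2.1.1/TestDataLatest/*.log"
    # exclude is matched against the file name, not the full path
    exclude => "output.log"
  }
}

output {
  # writing to a directory the input does not watch also avoids the loop
  file {
    path => "C:/logstash-2.1.1/Output/output.log"
  }
}
```

Each pass through the loop adds one more layer of JSON escaping, which is where the accumulating backslashes come from.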


Try overwriting the message, as in the code below:

  grok {
    break_on_match => false
    match => { "message" => "%{TIMESTAMP_ISO8601:log_timestamp} %{WORD:s_sitename}" }
    overwrite => [ "message" ]
  }
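Separately, the pattern in the question contains stray spaces inside several `%{...}` tokens (`%{IP: ServerIP}`, `%{NUMBER: ResponseBytes}`, `%{NUMBER: HttpStatusCode}`, `%{NUMBER: HttpSubStatusCode}`), which is likely why those fields never appear in the JSON output shown in the question. A cleaned-up version of the same pattern, with the spaces removed, might look like this:

```
  grok {
    match => ["message", "%{IP:ClientIP}, %{USER:UserName}, %{DATE:RequestDate}, %{TIME:RequestTime}, %{WORD:MSSVC}, %{WORD:ServerName}, %{IP:ServerIP}, %{NUMBER:ProcessingTime}, %{NUMBER:RequestBytes}, %{NUMBER:ResponseBytes}, %{NUMBER:HttpStatusCode}, %{NUMBER:HttpSubStatusCode}, %{WORD:HttpVerb}, %{GREEDYDATA:RequestUri}, %{GREEDYDATA:QueryParam}"]
  }
```

Grok field names must follow the colon immediately (`%{SYNTAX:SEMANTIC}` with no whitespace).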
