Logstash file input add_field does not add the field to every line

I am parsing several log files from different load-balanced server clusters with a Logstash configuration and want to add a field "log_origin" to the entries of each file so that I can filter on it later.

Here is the input -> file configuration from a simplified example:

input {
  file {
    type => "node1"
    path => "C:/Development/node1/log/*"
    add_field => [ "log_origin", "live_logs" ]
  }
  file {
    type => "node2"
    path => "C:/Development/node2/log/*"
    add_field => [ "log_origin", "live_logs" ]
  }
  file {
    type => "node3"
    path => "C:/Development/node1/log/*"
    add_field => [ "log_origin", "live_logs" ]
  }
  file {
    type => "node4"
    path => "C:/Development/node1/log/*"
    add_field => [ "log_origin", "live_logs" ]
  }
}

filter {
    grok {
        match => [
            "message","%{DATESTAMP:log_timestamp}%{SPACE}\[%{DATA:class}\]%{SPACE}%{LOGLEVEL:loglevel}%{SPACE}%{GREEDYDATA:log_message}"
        ]
    }

    date { 
        match => [ "log_timestamp",  "dd.MM.YY HH:mm:ss", "ISO8601" ]
        target => "@timestamp"
    }

    mutate {
        lowercase => ["loglevel"]
        strip     => ["loglevel"]
    }

    if "_grokparsefailure" in [tags] {
        multiline {
            pattern   => ".*"
            what      => "previous"
        }
    }

    if [fields.log_origin] == "live_logs" {
        if [type] == "node1" {
            mutate { 
                add_tag => "realsServerName1"
            }
        }
        if [type] == "node2" {
            mutate { 
                add_tag => "realsServerName2"
            }
        }
        if [type] == "node3" {
            mutate { 
                add_tag => "realsServerName3"
             }
        }
        if [type] == "node4" {
            mutate { 
                add_tag => "realsServerName4"
            }
        }
    }
}

output {
  stdout { }
  elasticsearch { embedded => true }
}
I would have expected Logstash to add this field with the given value to every log entry it finds, but it doesn't. Maybe I am taking a completely wrong approach here.

Edit: I cannot retrieve the logs directly from the nodes, but have to copy them to my "server" first. Otherwise I could simply have used the file path to distinguish the different clusters.
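
For reference, a path-based conditional would roughly look like the sketch below. This is only a sketch and assumes the copied files still carry the node name somewhere in their directory (e.g. a hypothetical C:/Development/copied/node1/...); "path" is the field the file input sets on every event:

filter {
  # only works if the copied log files keep the node name in their path
  if [path] =~ /node1/ {
    mutate { add_tag => "realsServerName1" }
  }
}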


Edit: It is working. I should have cleaned out the data in between; old entries without the added field were cluttering my results.

add_field expects a hash. It should be:

add_field => {
 "log_origin" => "live_logs" 
}
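
Applied to one of the file inputs from the question, the corrected block would look roughly like this (same type, path and value as above):

input {
  file {
    type      => "node1"
    path      => "C:/Development/node1/log/*"
    # hash syntax: every event read from these files gets log_origin => "live_logs"
    add_field => { "log_origin" => "live_logs" }
  }
}

As noted in the question's last edit, entries ingested before this change will still be missing the field, so the old data needs to be cleaned out afterwards.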
