
Adding the log4net level field to the logstash.conf file


I am trying to add the LEVEL field so that it shows up in Kibana. My logstash.conf

Input:

2018-03-18 15:43:40.7914 - INFO: Tick 
2018-03-18 15:43:40.7914 - ERROR: Tock
File:

This prints out "level" instead of INFO/ERROR, etc.
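For reference, a minimal grok filter that does extract INFO/ERROR from the sample lines above (a sketch, assuming the single-line "timestamp - LEVEL: message" format shown; the timestamp and logmessage field names are illustrative):

filter {
  grok {
    # TIMESTAMP_ISO8601 covers "2018-03-18 15:43:40.7914",
    # LOGLEVEL matches INFO, ERROR, WARN, etc.
    match => { "message" => "^%{TIMESTAMP_ISO8601:timestamp} - %{LOGLEVEL:level}: %{GREEDYDATA:logmessage}" }
  }
}

With a pattern that actually matches the input, level arrives as its own field and becomes visible in Kibana.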

EDIT:

Input:

Config:

# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.

input {
  beats {
    port => 5044
  }
}
filter {
  grok {      
      match => { "message" => "(?m)^%{TIMESTAMP_ISO8601:timestamp}~~\[%{DATA:thread}\]~~\[%{DATA:user}\]~~\[%{DATA:requestId}\]~~\[%{DATA:userHost}\]~~\[%{DATA:requestUrl}\]~~%{DATA:level}~~%{DATA:logger}~~%{DATA:logmessage}~~%{DATA:exception}\|\|" }
      add_field => { 
        "received_at" => "%{@timestamp}" 
        "received_from" => "%{host}"
      } 
    }
  grok {      
      match => { "message" => "- %{LOGLEVEL:level}" }
      remove_field => ["message"]      
    }
  date {
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss:SSS" ]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    sniffing => true
    index => "filebeat-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
    #user => "elastic"
    #password => "changeme"
  }
  stdout { codec => rubydebug }
}
  add_field => { 
    "received_at" => "%{@timestamp}" 
    "received_from" => "%{host}"
    "level" => "levell"
  }
I get output, but received_at, received_from and level are still missing:

With this part of the config:

  add_field => { 
    "received_at" => "%{@timestamp}" 
    "received_from" => "%{host}"
    "level" => "levell"
  }
Using "level" => "levell" just puts the literal string levell in the field level. To put in the value of a field named levell, you must use %{levell}. So in your case it would look like:

  add_field => { 
    "received_at" => "%{@timestamp}" 
    "received_from" => "%{host}"
    "level" => "%{levell}"
  }
Also, regarding grok's match option, according to the documentation:

A hash that defines the mapping of where to look, and with which patterns.


So trying to match on the level field won't work, because that field doesn't exist yet. Also, the grok pattern you use to match the message field doesn't match the sample you provided.

So should it be level => %{DATA:level} then?

In which filter? If you mean the grok filter, the level key in the pattern you posted is the name of the field being created; a field must exist before it can be used in a filter.

So how do I make it work so that the ERROR in this line: 2018-03-18 15:43:40.7914 - ERROR: Tock gets stored in level? grok { match => { "message" => "- %{LOGLEVEL:level}" } }

You can see the edit; it still doesn't work, and now received_at and received_from are missing as well. I still don't see level. Can you help?
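Putting the answer's points together, a sketch of a filter section that should populate level from a line like 2018-03-18 15:43:40.7914 - ERROR: Tock (assuming that single-line format; the timestamp and logmessage field names are illustrative):

filter {
  grok {
    match => { "message" => "^%{TIMESTAMP_ISO8601:timestamp} - %{LOGLEVEL:level}: %{GREEDYDATA:logmessage}" }
    # add_field is only applied when the match succeeds, which is
    # why received_at/received_from disappear when the pattern fails
    add_field => {
      "received_at"   => "%{@timestamp}"
      "received_from" => "%{host}"
    }
  }
  date {
    # note: the sample timestamp uses a dot before the fraction,
    # not the colon used in the original config
    match => [ "timestamp", "yyyy-MM-dd HH:mm:ss.SSSS" ]
  }
}

A single grok that matches the whole line replaces the second grok entirely; once the match succeeds, both level and the received_* fields should appear in the event.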