
elasticsearch: Filtering Filebeat input with or without Logstash


In our current setup we use Filebeat to ship logs to an Elasticsearch instance. The application logs are in JSON format, and the application runs in AWS.

For some reason, AWS decided to prefix the log lines in a new platform version, and now the log parsing no longer works.

Apr 17 06:33:32 ip-172-31-35-113 web: {"@timestamp":"2020-04-17T06:33:32.691Z","@version":"1","message":"Tomcat started on port(s): 5000 (http) with context path ''","logger_name":"org.springframework.boot.web.embedded.tomcat.TomcatWebServer","thread_name":"main","level":"INFO","level_value":20000}
Before this, it was simply:

{"@timestamp":"2020-04-17T06:33:32.691Z","@version":"1","message":"Tomcat started on port(s): 5000 (http) with context path ''","logger_name":"org.springframework.boot.web.embedded.tomcat.TomcatWebServer","thread_name":"main","level":"INFO","level_value":20000}
The question is whether we can avoid using Logstash to convert the log lines back into the old format. If not, how do I remove the prefix, and which filter is the best choice?

My current Filebeat configuration looks like this:

filebeat.inputs:
  - type: log
    paths:
      - /var/log/web-1.log
    json.keys_under_root: true
    json.ignore_decoding_error: true
    json.overwrite_keys: true
    fields_under_root: true
    fields:
      environment: ${ENV_NAME:not_set}
      app: myapp

cloud.id: "${ELASTIC_CLOUD_ID:not_set}"
cloud.auth: "${ELASTIC_CLOUD_AUTH:not_set}"

I would try to leverage the dissect and decode_json_fields processors:

processors:
  # first ignore the preamble and only keep the JSON data
  - dissect:
      tokenizer: "%{?ignore} %{+ignore} %{+ignore} %{+ignore} %{+ignore}: %{json}"
      field: "message"
      target_prefix: ""

  # then parse the JSON data
  - decode_json_fields:
      fields: ["json"]
      process_array: false
      max_depth: 1
      target: ""
      overwrite_keys: false
      add_error_key: true
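
For orientation, here is a minimal, untested sketch of how those processors could sit inside the filebeat.yml from the question. It assumes the json.* input options are removed, since decode_json_fields would now handle the JSON parsing; the processors block can live either under the individual input (as here) or at the top level of the file:

filebeat.inputs:
  - type: log
    paths:
      - /var/log/web-1.log
    fields_under_root: true
    fields:
      environment: ${ENV_NAME:not_set}
      app: myapp
    processors:
      # strip the "Apr 17 06:33:32 ip-172-31-35-113 web: " preamble
      - dissect:
          tokenizer: "%{?ignore} %{+ignore} %{+ignore} %{+ignore} %{+ignore}: %{json}"
          field: "message"
          target_prefix: ""
      # parse the remaining JSON payload into top-level fields
      - decode_json_fields:
          fields: ["json"]
          process_array: false
          max_depth: 1
          target: ""
          overwrite_keys: false
          add_error_key: true

cloud.id: "${ELASTIC_CLOUD_ID:not_set}"
cloud.auth: "${ELASTIC_CLOUD_AUTH:not_set}"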

Logstash has an input plugin for Beats that keeps the whole original log line in a field called message (as in your example).

If you don't want the beginning of the line to be included, use the dissect filter in Logstash. It should look something like this:

filter {
    dissect {
        mapping => {
            "message" => "%{}: %{message_without_prefix}"
         }
    }
}
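
For completeness, a rough sketch of how that filter could sit in a full Logstash pipeline. The beats port and Elasticsearch host below are placeholders, and the json filter step is an assumption about how the remaining payload would then be parsed back into fields:

input {
  beats {
    port => 5044   # placeholder port
  }
}

filter {
  # drop the syslog-style preamble added by the new platform version
  dissect {
    mapping => {
      "message" => "%{}: %{message_without_prefix}"
    }
  }
  # parse the remaining JSON payload into top-level fields (assumption)
  json {
    source => "message_without_prefix"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]   # placeholder host
  }
}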

Maybe Filebeat has both of these capabilities as well, but in my experience I prefer to use Logstash when parsing/manipulating log data.

I haven't tested this solution yet, but from my point of view it should work. Have you tried it?

Doing it now. Should I include it under the input item or at the global level? I guess it doesn't matter.... Type:"log", Meta:map[string]string(nil), FileStateOS:file.StateOS{Inode:0xc009ce, Device:0xca01}, TimeSeries:false}, Flags:0x1, Cache:publisher.EventCache{m:common.MapStr(nil)} (status=400): {"type":"mapper_parsing_exception","reason":"object mapping for [json] tried to parse field [json] as object, but found a concrete value"}

That's because your index probably already has a string-typed field called json. Try using another name, such as json_tmp or anything else.
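
A minimal sketch of that rename applied to the processor config from the first answer; json_tmp is just an illustrative field name, and the drop_fields step is optional cleanup:

processors:
  - dissect:
      tokenizer: "%{?ignore} %{+ignore} %{+ignore} %{+ignore} %{+ignore}: %{json_tmp}"
      field: "message"
      target_prefix: ""
  - decode_json_fields:
      fields: ["json_tmp"]
      process_array: false
      max_depth: 1
      target: ""
      overwrite_keys: false
      add_error_key: true
  # optionally remove the temporary field once its contents have been decoded
  - drop_fields:
      fields: ["json_tmp"]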