
Spring FluentD cannot write logs to Elasticsearch


Using:

  • fluentd 1.11.2
  • fluent-plugin-elasticsearch 4.1.3
  • elasticsearch 7.5.1
  • springboot 2.3.3

Running in Openshift (Kubernetes v1.17.1+20ba474).

Fluentd and Elasticsearch each run in separate pods.

Fluentd configuration file:

<source>
  @type forward
  port 24224
  bind 0.0.0.0
</source>
<filter *.**>
      @type parser
      key_name log
      reserve_data true
      <parse>
        @type none
      </parse>
</filter>
<match *.**>
  @type copy
  <store>
    @type elasticsearch
    host elasticdb
    port 9200
    logstash_format true
    logstash_prefix applogs
    logstash_dateformat %Y%m%d
    include_tag_key true
    type_name app_log
    tag_key @log_name
    flush_interval 1s
    user elastic
    password changeme
  </store>
  <store>
    @type stdout
  </store>
</match>
The strange thing is that it only works about one time in ten. Or it seems to work two or three times and then breaks until I change the index. Honestly, the pattern of behavior is unclear.

When it does not work (most of the time), the logs in the fluentd pod look like this:

2020-09-18 17:33:08.000000000 +0000 app.appaa: {"from":"userA","to":"userB"}
2020-09-18 17:33:37 +0000 [warn]: #0 dump an error event: error_class=ArgumentError error="log does not exist" location=nil tag="fluent.warn" time=2020-09-18 17:33:37.328180192 +0000 record={"error"=>"#<ArgumentError: log does not exist>", "location"=>nil, "tag"=>"app.appaa", "time"=>1600450388, "record"=>{"from"=>"userA", "to"=>"userB"}, "message"=>"dump an error event: error_class=ArgumentError error=\"log does not exist\" location=nil tag=\"app.appaa\" time=1600450388 record={\"from\"=>\"userAa\", \"to\"=>\"userBb\"}"}
2020-09-18 17:33:37.328180192 +0000 fluent.warn: {"error":"#<ArgumentError: log does not exist>","location":null,"tag":"app.appaa","time":1600450388,"record":{"from":"userA","to":"userB"},"message":"dump an error event: error_class=ArgumentError error=\"log does not exist\" location=nil tag=\"app.appaa\" time=1600450388 record={\"from\"=>\"userA\", \"to\"=>\"userB\"}"}
warning: 299 Elasticsearch-7.5.1-3ae9ac9a93c95bd0cdc054951cf95d88e1e18d96 "[types removal] Specifying types in bulk requests is deprecated."
The Elasticsearch pod itself shows nothing (a log-level issue, I assume), but if I query Elastic I see:

{
    "_index": "applogs-20200918",
    "_type": "_doc",
    "_id": "F0M2onQBB89nIri4Cb1Z",
    "_score": 1.0,
    "_source": {
        "error": "#<ArgumentError: log does not exist>",
        "location": null,
        "tag": "app.app",
        "time": 1600449251,
        "record": {
            "from": "userA",
            "to": "userB"
        },
        "message": "dump an error event: error_class=ArgumentError error=\"log does not exist\" location=nil tag=\"app.app\" time=1600449251 record={\"from\"=>\"userA\", \"to\"=>\"userB\"}",
        "@timestamp": "2020-09-18T17:14:39.775332214+00:00",
        "@log_name": "fluent.warn"
    }
}
So the error seems to come from

"elastic: ArgumentError: log does not exist"


Has anyone run into this error before?

The parser configuration in your filter, i.e.

<filter *.**>
  @type parser
  key_name log    # << Look for key `log` in event
  # ...
</filter>
means you need to send events in this shape:

{"log":"... your log here..."}
If the log value itself contains quotes, you may need to escape them.
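To see why the original events fail, here is a minimal Python sketch (an illustration only, not fluentd's actual implementation) of the key_name lookup: a record without the configured log key reproduces the "log does not exist" error, while a wrapped record passes:

```python
import json

# Illustrative stand-in for the filter's key_name lookup (hypothetical helper,
# not fluentd code). fluentd raises ArgumentError; plain Python raises KeyError.
def extract_key(record, key_name="log"):
    if key_name not in record:
        raise KeyError(f"{key_name} does not exist")
    return record[key_name]

bad = {"from": "userA", "to": "userB"}                        # what the app sends now
good = {"log": json.dumps({"from": "userA", "to": "userB"})}  # what the filter expects

try:
    extract_key(bad)
except KeyError as exc:
    print(exc)            # the missing-key case behind the warning

print(extract_key(good))  # the raw string the parser then processes
```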


Related documentation:

Have you tested this with the fluent-cat command?

Thanks for the comment. No, I haven't. Do you think it would make a difference? In all my attempts I sent events to fluentd both from a local service and from the same service running in a separate pod on OCP. Since I'm on Windows 10, I'm not very confident about installing the fluent agent, but if you think it would make a difference I'll give it a try. Thanks again!

You're welcome! It would isolate the fluentd-to-elasticsearch part of the pipeline. You can test it on the same machine where fluentd is running; fluent-cat should already be there, so you don't have to install it separately on another machine. That would help verify whether the client service is involved and whether the pipeline itself works. And yes, key_name log. Could you try:

echo '{"log":"hello"}' | fluent-cat debug.log

That worked, awesome!

Glad it helped. :) Yes, refer to the documentation for the configuration details. Also, I always run an isolated test with fluent-cat first.
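As a footnote to the thread above, this is roughly what the filter does once the log key is present. A small Python sketch, assuming (as the filter_parser docs describe) that the none parser keeps the whole value under message and that reserve_data true merges the original fields back in; this is an illustration, not fluentd's code:

```python
# Rough model of <filter> with @type parser, key_name log, parse @type none,
# and reserve_data true. Hypothetical helper for illustration only.
def none_parser_filter(record, key_name="log", reserve_data=True):
    raw = record[key_name]      # missing key -> fluentd's "log does not exist"
    parsed = {"message": raw}   # @type none: the whole value becomes "message"
    return {**record, **parsed} if reserve_data else parsed

print(none_parser_filter({"log": "hello", "from": "userA"}))
# {'log': 'hello', 'from': 'userA', 'message': 'hello'}
```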