Elasticsearch only creates the index after Logstash is stopped
When I run bin/logstash -f logstash.conf --debug, no index is created in Elasticsearch. Only after I stop Logstash does the index appear in ES. The problem occurs when I add codec => multiline, but I need to use codec => multiline. Can someone help me fix my logstash.conf?
input {
  file {
    type           => "json-log"
    path           => "/logfile/*.json"
    start_position => "beginning"
    sincedb_path   => "/dev/null"
    ignore_older   => 0
    codec => multiline {
      pattern => "^\["
      negate  => "true"
      what    => "previous"
    }
  }
}

filter {
  grok {
    match     => ["message", "%{GREEDYDATA:msg}"]
    overwrite => [ "message" ]
  }
  json {
    source => "msg"
  }
}

output {
  elasticsearch {
    hosts => ["xxxxx.com:9206"]
    index => "fpngilog-%{+YYYY-MM-dd}"
    #document_type => "hola"
  }
  stdout { codec => rubydebug }
}
I think what is happening is that the multiline codec matches every line it receives and collapses them all into a single event. Because every line matches, they stay "held" inside the multiline codec and are only flushed when you stop Logstash. As a workaround, you can add the auto_flush_interval option, or change the codec configuration so that not all lines match.

Thanks for the suggestion. Auto flush works fine now ^_^
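For reference, a minimal sketch of what the input block might look like with auto-flush enabled; the 5-second interval is an arbitrary example value, not taken from the original configuration:

input {
  file {
    type           => "json-log"
    path           => "/logfile/*.json"
    start_position => "beginning"
    sincedb_path   => "/dev/null"
    codec => multiline {
      pattern => "^\["
      negate  => "true"
      what    => "previous"
      # Flush a pending multiline event if no new line arrives within 5 seconds,
      # so the last buffered event reaches Elasticsearch without stopping Logstash.
      auto_flush_interval => 5
    }
  }
}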