Parsing json using logstash (ELK stack)
I created a simple json like below:
[
  {
    "Name": "vishnu",
    "ID": 1
  },
  {
    "Name": "vishnu",
    "ID": 1
  }
]
I saved those values in a file named simple.txt. Then I used filebeat to watch that file and send new updates to port 5043. On the other side, I started a logstash service that listens on that port, so it can parse the json and pass it on to elasticsearch.
Logstash is not processing the json values; it just hangs in the middle.
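A likely cause (my assumption, not confirmed by any log output): filebeat ships one event per line, so the json filter receives fragments of the pretty-printed array, none of which is valid JSON on its own. A minimal Python sketch of what each shipped line looks like to a JSON parser:

```python
import json

# The pretty-printed array from simple.txt, as filebeat would ship it:
# one "message" per line. None of these lines is a complete JSON document.
lines = [
    '[',
    '  {',
    '    "Name": "vishnu",',
    '    "ID": 1',
    '  },',
    '  {',
    '    "Name": "vishnu",',
    '    "ID": 1',
    '  }',
    ']',
]

for line in lines:
    try:
        json.loads(line)
        print(f"parsed: {line!r}")
    except json.JSONDecodeError:
        print(f"not valid JSON on its own: {line!r}")
```

Every line fails to parse, which is consistent with the json filter producing nothing useful.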
Logstash config:
input {
  beats {
    port => 5043
    host => "0.0.0.0"
    client_inactivity_timeout => 3600
  }
}
filter {
  json {
    source => "message"
  }
}
output {
  stdout { codec => rubydebug }
}
Filebeat config:
filebeat.prospectors:
- input_type: log
  paths:
    - filepath
output.logstash:
  hosts: ["localhost:5043"]
Logstash output:
Every time I run logstash with the command
logstash -f logstash.conf
nothing is done with the json, so I stop the service by pressing ctrl+c.
Please help me find a solution. Thanks in advance.

Finally I ended up with a config like this, and it worked for me:
input {
  file {
    codec => multiline {
      pattern => '^\{'
      negate => true
      what => previous
    }
    path => "D:\elasticdb\logstash-tutorial.log\Test.txt"
    start_position => "beginning"
    sincedb_path => "D:\elasticdb\logstash-tutorial.log\null"
    exclude => "*.gz"
  }
}
filter {
  json {
    source => "message"
    remove_field => ["path","@timestamp","@version","host","message"]
  }
}
output {
  elasticsearch {
    hosts => ["localhost"]
    index => "logs"
    document_type => "json_from_logstash_attempt3"
  }
  stdout {}
}
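The key piece is the multiline codec: with `pattern => '^\{'`, `negate => true`, and `what => previous`, any line that does not start with `{` is glued onto the previous event. A rough Python simulation (simplified; the real codec has more options such as timeouts) of that grouping logic:

```python
import json
import re

def multiline_join(lines, pattern=r'^\{'):
    """Simulate multiline codec with negate => true, what => previous:
    a line NOT matching the pattern is appended to the previous event."""
    events = []
    for line in lines:
        if re.match(pattern, line) or not events:
            events.append(line)            # line starts a new event
        else:
            events[-1] += "\n" + line      # continuation of the previous event
    return events

# A JSON object accidentally split across two lines, plus a one-line object.
raw = [
    '{"name":"sachin","ID":"1",',
    '"TS":1351146569}',
    '{"name":"sachin","ID":"1","TS":1351146569}',
]
events = multiline_join(raw)
for e in events:
    print(json.loads(e))  # each merged event is valid JSON again
```

With one JSON object per line (as in the working format below), every line starts with `{`, so each line simply becomes its own event.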
JSON format:
{"name":"sachin","ID":"1","TS":1351146569}
{"name":"sachin","ID":"1","TS":1351146569}
{"name":"sachin","ID":"1","TS":1351146569}
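This format works because each line is a complete JSON document (newline-delimited JSON), so the json filter can parse every event independently. A quick Python check of that property:

```python
import json

# One complete JSON document per line, as in the working Test.txt format.
ndjson = '''{"name":"sachin","ID":"1","TS":1351146569}
{"name":"sachin","ID":"1","TS":1351146569}
{"name":"sachin","ID":"1","TS":1351146569}'''

docs = [json.loads(line) for line in ndjson.splitlines()]
print(docs[0]["name"], docs[0]["TS"])  # sachin 1351146569
```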
Maybe filebeat has already read the file and sent the data? (Not the problem if you keep appending data to the file.)