Logstash/Filebeat only picks up some of the csv files

I have configured Filebeat to send different VoIP/SMS csv files to Logstash, but only the VoIP csv files ever reach Logstash. The csv files live under different folders:
logs/sms
logs/voip
One more note: I partially worked around this by creating tags for these .csv files in Filebeat, but I still have a question.
pwd
/usr/share/filebeat/logs
ls -ltr
drwxr-xr-x 2 root root 106496 Dec 4 03:39 sms
drwxr-xr-x 2 root root 131072 Dec 8 01:49 voip
ls -ltr sms | head -4
-rw-r--r-- 1 root root 7933 Dec 4 03:39 sms_cdr_1010.csv
-rw-r--r-- 1 root root 7974 Dec 4 03:39 sms_cdr_101.csv
-rw-r--r-- 1 root root 7949 Dec 4 03:39 sms_cdr_1009.csv
ls -ltr voip | head -4
-rw-r--r-- 1 root root 11616 Dec 4 03:39 voip_cdr_10.csv
-rw-r--r-- 1 root root 11533 Dec 4 03:39 voip_cdr_1.csv
-rw-r--r-- 1 root root 11368 Dec 4 03:39 voip_cdr_0.csv
Filebeat only starts harvesting the voip csv files:
2019-12-08T02:37:18.872Z INFO crawler/crawler.go:72 Loading Inputs: 1
2019-12-08T02:37:18.872Z INFO log/input.go:138 Configured paths: [/usr/share/filebeat/logs/voip/*]
2019-12-08T02:37:18.872Z INFO input/input.go:114 Starting input of type: log; ID: 801046369164835837
2019-12-08T02:37:18.872Z INFO crawler/crawler.go:106 Loading and starting Inputs completed. Enabled inputs: 1
2019-12-08T02:37:18.977Z INFO log/harvester.go:255 Harvester started for file: /usr/share/filebeat/logs/voip/voip_cdr_185.csv
2019-12-08T02:37:18.978Z INFO log/harvester.go:255 Harvester started for file: /usr/share/filebeat/logs/voip/voip_cdr_2809.csv
2019-12-08T02:37:18.979Z INFO log/harvester.go:255 Harvester started for file: /usr/share/filebeat/logs/voip/voip_cdr_2847.csv
filebeat.yml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - logs/sms/*
  tags: ["sms"]
  paths:
    - logs/voip/*
  tags: ["voip"]

output.logstash:
  enabled: true
  hosts: ["logstash:5044"]

logging.to_files: true
logging.files:
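A likely reason only one input loads: in a YAML mapping, a repeated key does not add a second value, and parsers that tolerate duplicates typically keep only the last occurrence. That matches the `Loading Inputs: 1` line in the log above. A minimal Python sketch of that overwrite behavior (an illustration with plain dicts, not Filebeat's actual parser):

```python
# Simulate how a single YAML mapping with repeated "paths"/"tags" keys
# collapses: the second pair silently replaces the first, so only the
# voip glob survives in the one input that gets loaded.
sms_block = {"paths": ["logs/sms/*"], "tags": ["sms"]}
voip_block = {"paths": ["logs/voip/*"], "tags": ["voip"]}

merged_input = {"type": "log", "enabled": True}
merged_input.update(sms_block)   # first paths/tags pair
merged_input.update(voip_block)  # duplicate keys overwrite the first pair

print(merged_input["paths"])  # ['logs/voip/*']
```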
logstash.conf
input {
  beats {
    port => "5044"
  }
}

filter {
  if "sms" in [tags] {
    csv {
      columns => ['Date', 'Time', 'PLAN', 'CALL_TYPE', 'MSIDN', 'IMSI', 'IMEI']
      separator => ","
      skip_empty_columns => true
      quote_char => "'"
    }
  }
  if "voip" in [tags] {
    csv {
      columns => ['Record_Nb', 'Date', 'Time', 'PostDialDelay', 'Disconnect-Cause', 'Sip-Status', 'Session-Disposition', 'Calling-RTP-Packets-Lost', 'Called-RTP-Packets-Lost', 'Calling-RTP-Avg-Jitter', 'Called-RTP-Avg-Jitter', 'Calling-R-Factor', 'Called-R-Factor', 'Calling-MOS', 'Called-MOS', 'Ingress-SBC', 'Egress-SBC', 'Originating-Trunk-Group', 'Terminating-Trunk-Group']
      separator => ","
      skip_empty_columns => true
      quote_char => "'"
    }
  }
}

output {
  if "sms" in [tags] {
    elasticsearch {
      hosts => ["elasticsearch:9200"]
      index => "smscdr_index"
    }
    stdout {
      codec => rubydebug
    }
  }
  if "voip" in [tags] {
    elasticsearch {
      hosts => ["elasticsearch:9200"]
      index => "voipcdr_index"
    }
    stdout {
      codec => rubydebug
    }
  }
}
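The csv filter above splits each line on `,` and strips the single-quote `quote_char` before assigning values to the configured column names. A Python sketch of that parsing step using the stdlib csv module; the record line and its field values here are made up for illustration:

```python
import csv
import io

# Hypothetical single-quoted SMS record (values invented for this example).
line = "'2019-12-04','03:39:12','PREPAID','MO','15551234567','310150123456789','352099001761481'"

# Column names taken from the "sms" branch of logstash.conf above.
columns = ['Date', 'Time', 'PLAN', 'CALL_TYPE', 'MSIDN', 'IMSI', 'IMEI']

# delimiter and quotechar mirror separator => "," and quote_char => "'"
reader = csv.reader(io.StringIO(line), delimiter=",", quotechar="'")
row = next(reader)
event = dict(zip(columns, row))

print(event["CALL_TYPE"])  # -> MO
```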
Try the configuration below:
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /usr/share/filebeat/logs/sms/*.csv
  tags: ["sms"]
- type: log
  enabled: true
  paths:
    - /usr/share/filebeat/logs/voip/*.csv
  tags: ["voip"]

output.logstash:
  enabled: true
  hosts: ["logstash:5044"]

logging.to_files: true
logging.files:
Fixed it, solution [here]:

filebeat.inputs:
- type: log
  enabled: true
  paths:
    - logs/sms/*.csv
  tags: ["sms"]
- type: log
  enabled: true
  paths:
    - logs/voip/*.csv
  tags: ["voip"]

Hi, thanks, it works. I had already found the solution, as in the comment above, but I didn't know how to format my answer as a code block. Thanks!
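The two-input layout works because each `- type: log` entry is a separate list item with its own mapping, so the sms and voip globs no longer collide on the same key. Sketched as plain Python data, mirroring only the YAML structure:

```python
# Each list item is an independent mapping, so both path globs survive.
inputs = [
    {"type": "log", "enabled": True,
     "paths": ["logs/sms/*.csv"], "tags": ["sms"]},
    {"type": "log", "enabled": True,
     "paths": ["logs/voip/*.csv"], "tags": ["voip"]},
]

globs = [g for item in inputs for g in item["paths"]]
print(globs)  # both the sms and voip globs are present
```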