Performance: How to reduce Logstash memory usage
I am running Logstash 5.6.5 (on Windows) on a standalone system (no cloud or cluster). The plan is to watch a few log files and publish them to a locally running Elasticsearch. However, when I checked Logstash's memory usage with no file-watching configuration at all, it already showed roughly 600MB. Adding file input pipeline configurations increases it further (watching 3 log files added about 70MB, and I plan to add up to 20 logs).

1. Is this the expected behavior?
2. Is there any way to reduce Logstash's large memory footprint?

After a few days of research, here is my answer to my own question. The following file input options help optimize Logstash memory usage:
- ignore_older (in seconds) - completely ignore any file older than the given number of seconds
- max_open_files (a count) - cap the maximum number of files held open at once
- close_older (in seconds) - close file handles that have not been updated for this long
- exclude - an array of unwanted file names (with or without wildcards)
input {
  file {
    # The application log path that will match the rolling logs.
    path => "c:/path/to/log/app-1.0-*.log"
    # I didn't want logs older than an hour.
    # If an older file gets a new entry, it counts as new again
    # and the new entry will be read by Logstash.
    ignore_older => 3600
    # I wanted only the most recent files to be watched.
    # Since I know there won't be more than 5 files, I set it to 5.
    max_open_files => 5
    # If a log file is not updated for 5 minutes, close it.
    # It will be reopened if a new entry gets added.
    close_older => 300
  }
}
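The file input options above bound how many files Logstash keeps open, but the ~600MB baseline with no inputs configured comes largely from the JVM heap itself. In Logstash 5.x the heap limits live in `config/jvm.options`; a minimal sketch follows, where the 256m values are assumptions you would tune for your own filters and throughput, not recommended defaults:

```
## config/jvm.options -- lower the JVM heap for a light, standalone pipeline
## (256m is an assumed starting point; adjust for your workload)
-Xms256m
-Xmx256m
```

Setting `-Xms` equal to `-Xmx` avoids heap resizing at runtime. If you lower the heap, watch the Logstash logs for `OutOfMemoryError` under real load before settling on a value.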