
Logstash HTTPD_COMBINEDLOG pattern-not-defined error


I get an error when starting Logstash with a grok filter for the Apache combined log format.

Configuration file:

input {
        file {
            path => "/u/agrawalo/logstash-5.4.0/event-data/apache_access.log"
            start_position => "beginning"
        }

        http {
        }
}

filter {
        grok {
           match => { "message" => "%{HTTPD_COMBINEDLOG}" }
        }
}

output {
        stdout {
            codec => rubydebug
        }
}
Command used to start Logstash:

bin/logstash -f config/pipelines/apacheauto.conf --config.reload.automatic
Error:

Sending Logstash's logs to /u/agrawalo/logstash-5.4.0/logs which is now configured via log4j2.properties
04:18:45.723 [[main]-pipeline-manager] ERROR logstash.pipeline - Error registering plugin {:plugin=>"#<LogStash::FilterDelegator:0x7bfa005e @id=\"498367beab653b0a3133b16fc4dcef59f08886de-3\", @klass=LogStash::Filters::Grok, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x684a02d @metric=#<LogStash::Instrument::Metric:0x68e13c68 @collector=#<LogStash::Instrument::Collector:0x7fe7de03 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x5434c951 @store=#<Concurrent::Map:0x77929e32 @default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x16f1fed4>, @fast_lookup=#<Concurrent::Map:0x57273dcf @default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :filters, :\"498367beab653b0a3133b16fc4dcef59f08886de-3\", :events]>, @logger=#<LogStash::Logging::Logger:0x462b61a2 @logger=#<Java::OrgApacheLoggingLog4jCore::Logger:0x4941bd9c>>, @filter=<LogStash::Filters::Grok match=>{\"message\"=>\"%{HTTPD_COMBINEDLOG}\"}, id=>\"498367beab653b0a3133b16fc4dcef59f08886de-3\", enable_metric=>true, periodic_flush=>false, patterns_files_glob=>\"*\", break_on_match=>true, named_captures_only=>true, keep_empty_captures=>false, tag_on_failure=>[\"_grokparsefailure\"], timeout_millis=>30000, tag_on_timeout=>\"_groktimeout\">>", :error=>"pattern %{HTTPD_COMBINEDLOG} not defined"}
04:18:45.731 [[main]-pipeline-manager] ERROR logstash.agent - Pipeline aborted due to error {:exception=>#<Grok::PatternError: pattern %{HTTPD_COMBINEDLOG} not defined>, :backtrace=>["/u/agrawalo/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.4/lib/grok-pure.rb:123:in `compile'", "org/jruby/RubyKernel.java:1479:in `loop'", "/u/agrawalo/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/jls-grok-0.11.4/lib/grok-pure.rb:93:in `compile'", "/u/agrawalo/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.1/lib/logstash/filters/grok.rb:274:in `register'", "org/jruby/RubyArray.java:1613:in `each'", "/u/agrawalo/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.1/lib/logstash/filters/grok.rb:269:in `register'", "org/jruby/RubyHash.java:1342:in `each'", "/u/agrawalo/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.3.1/lib/logstash/filters/grok.rb:264:in `register'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:268:in `register_plugin'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:279:in `register_plugins'", "org/jruby/RubyArray.java:1613:in `each'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:279:in `register_plugins'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:289:in `start_workers'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/pipeline.rb:214:in `run'", "/u/agrawalo/logstash-5.4.0/logstash-core/lib/logstash/agent.rb:398:in `start_pipeline'"]}
04:18:46.405 [Api Webserver] INFO  logstash.agent - Successfully started Logstash API endpoint {:port=>9600}
After further debugging, I found that the httpd patterns file is missing from the bundled pattern set:

agrawalo@abc:~/logstash-5.4.0/vendor/bundle/jruby/1.9/gems/logstash-patterns-core-4.0.2/patterns> ls
aws  bacula  bro  exim  firewalls  grok-patterns  haproxy  java  junos  linux-syslog  mcollective  mcollective-patterns  mongodb  nagios  postgresql  rails  redis  ruby
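For context: in logstash-patterns-core 4.0.2, the version bundled with Logstash 5.4.0 as shown above, the Apache combined-log pattern is defined in the `grok-patterns` file under the name `COMBINEDAPACHELOG`; the `HTTPD_COMBINEDLOG` name only exists in later releases of the pattern library. A minimal workaround, assuming the input really is in Apache combined format, is to reference the older name:

```
filter {
        grok {
           # COMBINEDAPACHELOG is defined in the bundled grok-patterns file
           # shipped with logstash-patterns-core 4.0.2
           match => { "message" => "%{COMBINEDAPACHELOG}" }
        }
}
```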
Questions:

  • Why is this pattern missing?

  • How can I include or install this pattern in an existing Logstash installation?


  • Can I fix this by upgrading my Logstash version?

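If keeping the `HTTPD_COMBINEDLOG` name is preferred without upgrading, one option is to define it locally and point the grok filter at a custom patterns directory via the `patterns_dir` option. This is a sketch, not tested against 5.4.0, and the `./patterns` directory and `httpd-extra` file name are made-up examples:

```
# ./patterns/httpd-extra  (hypothetical local patterns file)
# Alias the missing name to the pattern that 4.0.2 does ship:
#
#   HTTPD_COMBINEDLOG %{COMBINEDAPACHELOG}

filter {
        grok {
           patterns_dir => ["./patterns"]
           match => { "message" => "%{HTTPD_COMBINEDLOG}" }
        }
}
```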