
elasticsearch Logstash: Value too large to output


I recently built an ELK stack using version 5.0.0-1.

While grokking JBoss logs with a multiline filter, I see the following error:

[2016-11-14T19:48:48,802][ERROR][logstash.filters.grok    ] Error while attempting to check/cancel excessively long grok patterns {:message=>"Mutex relocking by same thread", :class=>"ThreadError", :backtrace=>["org/jruby/ext/thread/Mutex.java:90:in `lock'", "org/jruby/ext/thread/Mutex.java:147:in `synchronize'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.2.3/lib/logstash/filters/grok/timeout_enforcer.rb:38:in `stop_thread_groking'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.2.3/lib/logstash/filters/grok/timeout_enforcer.rb:53:in `cancel_timed_out!'", "org/jruby/RubyHash.java:1342:in `each'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.2.3/lib/logstash/filters/grok/timeout_enforcer.rb:45:in `cancel_timed_out!'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.2.3/lib/logstash/filters/grok/timeout_enforcer.rb:44:in `cancel_timed_out!'", "/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-grok-3.2.3/lib/logstash/filters/grok/timeout_enforcer.rb:63:in `start!'"]}
[2016-11-14T19:48:48,802][WARN ][logstash.filters.grok    ] Timeout executing grok '%{DATA:prefixofMessage}<tXML>%{DATA:orderXML}</tXML>' against field 'message' with value 'Value too large to output (27191 bytes)! First 255 chars are: 2016-10-30 23:28:02,193 INFO  [nucleusNamespace.com.NAMESPACEREDACTED.NAMESPACEREDACTED.NAMESPACEREDACTED] (ajp-IPADDRESSREDACTED-PORTREDACTED-325) DEBUG  NAMEREDACTED | order xml ----------- <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
The same filter worked fine on 2.4, but running it on 5.0.0-1 I now see this.
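For reference, a sketch of the filter in question, reconstructed from the pattern shown in the timeout warning above (the surrounding `filter` block and the `timeout_millis` tuning are assumptions, not the asker's actual config; `timeout_millis` is the grok filter's per-match timeout option, which defaults to 30000 ms in the 5.x plugin):

```
filter {
  grok {
    # Same pattern as in the WARN line above. Two DATA captures on a
    # ~27 KB event can be slow even though DATA is non-greedy.
    match => { "message" => "%{DATA:prefixofMessage}<tXML>%{DATA:orderXML}</tXML>" }
    # Raise the per-match timeout while diagnosing; setting it to 0
    # disables the timeout enforcer entirely.
    timeout_millis => 60000
  }
}
```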


Has anyone else seen this on this version of the ELK stack?

This was fixed in . You can either upgrade the plugin now or wait for Logstash 5.0.1.

There is also a similar open issue in the Logstash grok repo: .

I'm using Logstash 5.0.1 and am still getting the grok timeout / value-too-large error.

@ZianyD That is probably because your regex is wrong. This error is supposed to fire on ReDoS regexps. In the case above it was an actual bug, but you haven't provided enough context to determine that.
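To illustrate the commenter's point: patterns built from several unanchored `DATA`/`GREEDYDATA` captures backtrack badly on large events that don't match, which is exactly what the timeout enforcer is meant to catch. A hedged sketch of a tighter alternative (the field names and the specific log layout are illustrative assumptions based on the "First 255 chars" excerpt above, not a verified match for the asker's logs; `tag_on_timeout` is the grok option that tags events exceeding `timeout_millis`, defaulting to `_groktimeout`):

```
filter {
  grok {
    # Anchor at the start of the event and match concrete fields
    # (timestamp, level, category) instead of stacking DATA captures,
    # so non-matching lines fail fast rather than backtracking.
    match => { "message" => "\A%{TIMESTAMP_ISO8601:ts} %{LOGLEVEL:level}%{SPACE}\[%{DATA:category}\]" }
    # Tag events that still exceed the timeout so they can be inspected
    # downstream instead of silently losing the match result.
    tag_on_timeout => "_groktimeout"
  }
}
```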