Logstash hangs with error sized_queue_timeout

We have a Logstash pipeline in which many logstash-forwarder instances forward logs to a single Logstash instance. We have observed several times that Logstash hangs with the following error:

[2016-07-22 03:01:12.619]  WARN -- Concurrent::Condition: [DEPRECATED] Will be replaced with Synchronization::Object in v1.0.
called on: /opt/logstash-1.5.3/vendor/bundle/jruby/1.9/gems/logstash-input-lumberjack-1.0.2/lib/logstash/sized_queue_timeout.rb:16:in `initialize'
Exception in thread ">output" java.lang.UnsupportedOperationException
        at java.lang.Thread.stop(Thread.java:869)
        at org.jruby.RubyThread.exceptionRaised(RubyThread.java:1221)
        at org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:112)
        at java.lang.Thread.run(Thread.java:745)
Our Logstash configuration looks like this:

input {
 lumberjack {
  port => 6782
  codec => json {}
  ssl_certificate => "/opt/logstash-1.5.3/cert/logstash-forwarder.crt"
  ssl_key => "/opt/logstash-1.5.3/cert/logstash-forwarder.key"
  type => "lumberjack"
 }
}

filter {
  if [env] != "prod" and [env] != "common" {
    drop {}
  }
  if [message] =~ /^\s*$/ {
    drop { }
  }
}

output {
  if "_jsonparsefailure" in [tags] {
    file {
      path => "/var/log/shop/parse_error/%{env}/%{app}/%{app}_%{host}_%{+YYYY-MM-dd}.log"
    }
  } else {
    kafka {
      broker_list => ["kafka:9092"]
      topic_id => "logstash_logs2"
    }
  }
}

After restarting Logstash it starts working again. Can anyone tell me why this problem occurs, and how we can fix it without having to restart Logstash every time?

I strongly suggest switching to Filebeat, since logstash-forwarder is no longer maintained. That will likely resolve the lumberjack-related issues.
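
As a rough sketch of what the switch could look like: on the shipping hosts, a minimal Filebeat configuration (recent Filebeat versions) might be something like the following. The log path, host name, and port are placeholders, not values from the original setup.

filebeat.inputs:
  - type: log
    paths:
      - /var/log/shop/*.log              # placeholder path, point at the real log files

output.logstash:
  hosts: ["logstash-host:5044"]          # placeholder Logstash host and port
  ssl.certificate_authorities: ["/opt/logstash-1.5.3/cert/logstash-forwarder.crt"]

On the Logstash side, the lumberjack input would be replaced with a beats input. Note that the logstash-input-beats plugin may require upgrading beyond Logstash 1.5.3 or installing the plugin separately; the rest of the filter and output sections can stay as they are.

input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/opt/logstash-1.5.3/cert/logstash-forwarder.crt"
    ssl_key => "/opt/logstash-1.5.3/cert/logstash-forwarder.key"
    codec => json {}
  }
}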