Ruby: trouble rescuing from Net::ReadTimeout on a Bunny connection


I have a Ruby script that uses the Bunny gem to connect to a RabbitMQ instance. The script works for a while, but eventually dies with a Net::ReadTimeout:

E, [2017-08-13T08:48:09.671988 #21351] ERROR -- #<Bunny::Session:0x39eca20 scrapes@104.196.154.25:5672, vhost=/, addresses=[104.196.154.25:5672]>: Uncaught exception from consumer #<Bunny::Consumer:32353120 @channel_id=1 @queue=sc_link_queue> @consumer_tag=bunny-1502631967000-46739673895>: #<Net::ReadTimeout: Net::ReadTimeout> @ /home/rails/.rvm/rubies/ruby-2.3.3/lib/ruby/2.3.0/net/protocol.rb:158:in `rbuf_fill'
E, [2017-08-13T08:48:32.468023 #23205] ERROR -- #<Bunny::Session:0x42202a0 scrapes@104.196.154.25:5672, vhost=/, addresses=[104.196.154.25:5672]>: Uncaught exception from consumer #<Bunny::Consumer:36695920 @channel_id=1 @queue=sc_link_queue> @consumer_tag=bunny-1502631972000-482787698591>: #<Net::ReadTimeout: Net::ReadTimeout> @ /home/rails/.rvm/rubies/ruby-2.3.3/lib/ruby/2.3.0/net/protocol.rb:158:in `rbuf_fill'

As you can see, I'm attempting to rescue from pretty much everything that could possibly happen, yet I still can't get the worker to recover from the Net::ReadTimeout. Once the worker dies you can still see it connected to RabbitMQ, but the last item it pulled from the queue is left unacknowledged and the process essentially hangs.
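For context, the pattern described above looks roughly like the sketch below (the question's original code isn't reproduced here): the rescue/retry wraps the call that starts the consumer. do_work, $rabbitmq_opts and LOGGER are placeholders for app-level pieces.

require 'bunny'

conn = Bunny.new($rabbitmq_opts)   # $rabbitmq_opts: app-level connection options
conn.start

begin
  channel = conn.create_channel
  channel.prefetch(1)
  queue = channel.queue('sc_link_queue', durable: true)

  # The rescue below wraps the call that starts the consumer,
  # not the per-message work itself.
  queue.subscribe(manual_ack: true, block: true) do |delivery_info, _properties, payload|
    do_work(payload)                          # placeholder for the actual processing
    channel.ack(delivery_info.delivery_tag)   # never reached if do_work raises
  end
rescue Net::ReadTimeout, Timeout::Error, StandardError => e
  LOGGER.error("[x] #{e}")
  retry
end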

I figured this out. The issue is that everything running inside the Bunny subscribe block is processed on a different thread, so the rescue statements need to be added inside that block:

module Sc
  class Worker
    def initialize
      init()
    end

    def self.start_headless(type)
      Headless.new(display: 50, destroy_at_exit: false, reuse: true).start
      worker = new
      worker.send(type)
    end

    def init
      $conn ||= Bunny.new($rabbitmq_opts)
      $conn.start
      @browser = Sc::Browser.new()
    rescue Timeout::Error, Net::ReadTimeout, Selenium::WebDriver::Error::UnknownError, Errno::ECONNREFUSED, Selenium::WebDriver::Error::JavascriptError, Exception, StandardError => e
      LOGGER.error("[x] Trouble connecting to rabbitmq, retrying...")
      LOGGER.error("[x] #{e}")
      LOGGER.error("[x] #{e.backtrace}")
      retry
    end

    def listen_for_searches
      channel = $conn.create_channel
      channel.prefetch(1)
      queue = channel.queue($rabbitmq_search_queue, durable: true)
      exchange = channel.default_exchange
      queue.subscribe(manual_ack: true, block: true) do |delivery_info, properties, payload|
        LOGGER.info "[x] Received #{payload}"
        payload = JSON.parse(payload)
        scrape = Sc::Search.new(browser: @browser.browser, county: payload["name"], type: payload["type"], date_type: payload["date_type"])
        scrape.run
        scrape.close
        channel.ack(delivery_info.delivery_tag)
      end
    rescue Timeout::Error, Net::ReadTimeout, Selenium::WebDriver::Error::UnknownError, Errno::ECONNREFUSED, Selenium::WebDriver::Error::JavascriptError, Exception, StandardError => e
      LOGGER.error("[x] #{e}")
      LOGGER.error("[x] #{e.backtrace}")
      LOGGER.error("[x] Trouble with scrape, retrying...")
      retry
    end
  end
end
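
To make that concrete, a rescue placed inside the subscribe block itself could look something like the sketch below. handle_payload is a hypothetical placeholder for the per-message work, and requeueing with nack on failure is just one option; acking and dropping the message may be more appropriate if reprocessing isn't safe.

queue.subscribe(manual_ack: true, block: true) do |delivery_info, _properties, payload|
  begin
    handle_payload(payload)                    # placeholder for the actual work
    channel.ack(delivery_info.delivery_tag)
  rescue Timeout::Error, Net::ReadTimeout, StandardError => e
    LOGGER.error("[x] #{e}")
    # Requeue so the message isn't left unacknowledged when the handler fails.
    channel.nack(delivery_info.delivery_tag, false, true)
  end
end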