What is the best way to fetch a limited amount of data via API calls with Sidekiq in Ruby on Rails?
I am new to this. My site is deployed on Heroku. In my code I have declared the following:
config/sidekiq.yml
# Place this file at config/sidekiq.yml and Sidekiq will
# pick it up automatically.
---
:verbose: false
:concurrency: 20
# Set timeout to 8 on Heroku, longer if you manage your own systems.
:timeout: 30
# Sidekiq will run this file through ERB when reading it so you can
# even put in dynamic logic, like a host-specific queue.
# http://www.mikeperham.com/2013/11/13/advanced-sidekiq-host-specific-queues/
# you can override concurrency based on environment
production:
:concurrency: 25
staging:
:concurrency: 15
In my worker I fetch the data from the API.
My Procfile contains: `worker: bundle exec sidekiq -C config/sidekiq.yml`
My worker looks like:
class SfApifetch
  include Sidekiq::Worker
  sidekiq_options retry: false

  def perform(name, count)
    # do something
    # Net::HTTP ... rest of the code
  end
end
Can anyone suggest how to fetch the data in limited batches, for example 1000 records per page, then the next 1000, and so on, so that performance is not hurt and the overall fetch is faster?
I am new to Sidekiq and background jobs, so any suggestion would help me a lot.
If the external API provides an `id` or some other unique field, you can save it into a variable and pass it to the next worker call, for example:
class AwesomeWorker
  LIMIT = 1000
  include Sidekiq::Worker

  def perform(name, count, last_id = nil)
    # fetch up to LIMIT records starting after last_id
    # (fetch from the beginning when last_id is nil)
    # then re-enqueue the worker with the id of the last record;
    # stop re-enqueueing once no records are returned, or the chain
    # will never terminate
    AwesomeWorker.perform_async(name, count, data.last_id)
  end
end
Comments:
spickermann: Can you post your worker?
OP: Thank you for your interest in my post. Please suggest how to optimize it to fetch data faster in batches of 1000.
spickermann: Does the external resource support pagination or batching?
OP: No, the external API supports neither pagination nor batching.