
Python link_error like a Celery chain


I have a function that kicks off a sample job and updates the database accordingly. It looks like this:

import uuid

from celery import chain

# add, mul, throw_exception, change_job_status and change_parent_job_status
# are this app's Celery tasks; CeleryUtil is its database/bookkeeping helper.

def schedule_sample_job(hosts):
    parent_job_id = 'SampleBatchJob|%s|Phantom' % uuid.uuid1()
    child_jobs = []
    host_entries = []
    for host in hosts:
        job_id = 'SampleJob|%s|Phantom' % uuid.uuid1()
        res = chain(
            add.si(2, 2),
            add.si(3, 3),
            throw_exception.si('4'),
            mul.si(4, 4),
            mul.si(5, 5),
        )
        capacity_task = res.apply_async(
            serializer='json',
            link=[change_job_status.s(job_id, 'SUCCESS')],
            link_error=[change_job_status.s(job_id, 'FAILED'),
                        change_parent_job_status.s(parent_job_id)],
        )
        host_entry = {'host': host}
        host_entries.append(host_entry)
        celery_util = CeleryUtil()
        celery_util.store_chain_job(job_id, capacity_task, parent_job_id,
                                    'anonymous', [host_entry])
        child_jobs.append(job_id)

    celery_util = CeleryUtil()
    celery_util.store_chain_job(parent_job_id, child_jobs, None, 'anonymous', host_entries)
    job_status = celery_util.build_job_status(parent_job_id)
    return job_status

As you can see, I launch one job per host, and in link_error I pass multiple tasks. I want them to behave like a chain, where each task executes only after the previous one has finished, but that is not how it behaves in my program. Any help is much appreciated.
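One way to get that ordering, sketched here as an untested suggestion: when link_error receives a list of signatures, Celery dispatches each one independently, with no ordering guarantee, which matches what you are seeing. Instead you can pass a single chain as the error callback, since a chain is itself a signature (whether chains are accepted as error callbacks has varied across Celery releases, so verify against your version). Using .si() keeps the failed task's id out of the callbacks' arguments; switch the first link back to .s() if change_job_status needs that id.

    from celery import chain

    # Chain the two error callbacks so change_parent_job_status runs only
    # after change_job_status has finished, instead of both firing
    # independently as they do with the list form.
    error_chain = chain(
        change_job_status.si(job_id, 'FAILED'),
        change_parent_job_status.si(parent_job_id),
    )

    capacity_task = res.apply_async(
        serializer='json',
        link=[change_job_status.s(job_id, 'SUCCESS')],
        link_error=error_chain,
    )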

When starting the worker you can set its concurrency to 1 (celery worker --concurrency=1), so that only one job executes at a time.

Ha, I'm looking for a real solution; I obviously don't want to set concurrency to 1, since that would severely limit my throughput. Thanks for the suggestion, though.