Python 3.x: Why does the Celery worker leave tasks in PENDING state for so long?

I have a Celery worker running tasks.py, shown below:
from celery import Celery
from kombu import Connection, Exchange, Queue, Consumer
import socket

app = Celery('tasks', backend='redis://', broker='pyamqp://guest:guest@localhost/')
app.conf.task_default_queue = 'default'
app.conf.task_queues = (
    Queue('queue1', routing_key='tasks.add'),
    Queue('queueA', routing_key='tasks.task_1'),
    Queue('queueB', routing_key='tasks.task_2'),
    Queue('queueC', routing_key='tasks.task_3'),
    Queue('queueD', routing_key='tasks.task_4')
)

@app.task
def add(x, y):
    print("add(" + str(x) + "+" + str(y) + ")")
    return x + y
and a tasks_canvas.py that builds a chain of tasks, shown below:
from celery import signature
from celery import chain
from tasks import *

signature('tasks.add', args=(2, 2))
result = chain(add.s(2, 2), add.s(4), add.s(8)).apply_async(queue='queue1')
print(result.status)
print(result.get())
However, when I run tasks_canvas.py, result.status is always PENDING and the worker never runs the whole chain. Here is the output of running tasks_canvas.py:
C:\Users\user_\Desktop\Aida>tasks_canvas.py
PENDING
And here is the worker's output:
C:\Users\user_\Desktop\Aida>celery -A tasks worker -l info -P eventlet
-------------- celery@User-RazerBlade v4.2.0 (windowlicker)
---- **** -----
--- * *** * -- Windows-10-10.0.17134-SP0 2018-07-16 12:04:20
-- * - **** ---
- ** ---------- [config]
- ** ---------- .> app: tasks:0x41d5390
- ** ---------- .> transport: amqp://guest:**@localhost:5672//
- ** ---------- .> results: redis://
- *** --- * --- .> concurrency: 4 (eventlet)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
.> queue1 exchange=(direct) key=tasks.add
.> queueA exchange=(direct) key=tasks.task_1
.> queueB exchange=(direct) key=tasks.task_2
.> queueC exchange=(direct) key=tasks.task_3
.> queueD exchange=(direct) key=tasks.task_4
[tasks]
. tasks.add
. tasks.task_1
. tasks.task_2
. tasks.task_3
. tasks.task_4
. tasks.task_5
[2018-07-16 12:04:20,334: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2018-07-16 12:04:20,351: INFO/MainProcess] mingle: searching for neighbors
[2018-07-16 12:04:21,394: INFO/MainProcess] mingle: all alone
[2018-07-16 12:04:21,443: INFO/MainProcess] celery@User-RazerBlade ready.
[2018-07-16 12:04:21,448: INFO/MainProcess] pidbox: Connected to amqp://guest:**@127.0.0.1:5672//.
[2018-07-16 12:04:23,101: INFO/MainProcess] Received task: tasks.add[e6306b5b-211f-4015-b57e-05e2d0ac2df2]
[2018-07-16 12:04:23,102: WARNING/MainProcess] add(2+2)
[2018-07-16 12:04:23,128: INFO/MainProcess] Task tasks.add[e6306b5b-211f-4015-b57e-05e2d0ac2df2] succeeded in 0.031000000000858563s: 4
I would like to know why this happens, since I am new to Celery, and how I can get the worker to run the entire chain.

I solved this problem. There is a guide on why a task is always PENDING, but it does not cover every case. In mine, it was a task-routing problem: only the first task of the chain was being delivered to queue1 (which is why the worker log shows a single add(2+2) succeeding), while the remaining links were routed elsewhere and never reached a queue this worker consumes, so the chain's final result stayed PENDING. When I use the default queue instead, all tasks in the chain run immediately.
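A minimal configuration sketch of one possible fix, assuming the same tasks.py layout as above (this `task_routes` mapping is my illustration, not necessarily the poster's exact change): declare a route for tasks.add so that every link of the chain, not just the first, is published to queue1.

```python
from celery import Celery
from kombu import Queue

app = Celery('tasks', backend='redis://', broker='pyamqp://guest:guest@localhost/')
app.conf.task_default_queue = 'default'
app.conf.task_queues = (
    Queue('queue1', routing_key='tasks.add'),
)
# Route every invocation of tasks.add to queue1, including the later links
# of a chain. Passing queue='queue1' to apply_async only affects the first
# task; a task_routes entry applies each time any tasks.add message is sent.
app.conf.task_routes = {
    'tasks.add': {'queue': 'queue1'},
}
```

With this in place, `chain(add.s(2, 2), add.s(4), add.s(8)).apply_async()` can be called without a `queue=` argument, and the worker consuming queue1 receives all three links.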