Django-Celery with RabbitMQ: task always remains in PENDING state

I have to use Celery 4.0.2 with RabbitMQ 3.6.10 to handle asynchronous tasks, and I followed this tutorial:

However, there is a small problem with my tasks: I cannot get any result. The task always remains in the 'PENDING' state.

My question is: what do I have to do to get the result?

Thanks in advance for your answers.

Here is my code:

>>> from blog.tasks import *
>>> job = add.delay(2,3)
>>> job.state
'PENDING'
>>> job.result
>>>
Here is my __init__.py:

from __future__ import absolute_import, unicode_literals

# This will make sure the app is always imported when
# Django starts so that shared_task will use this app.
from .celery import app as celery_app

__all__ = ['celery_app']
Here is part of my settings.py:

BROKER_URL = 'amqp://guest:guest@localhost//'
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
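With app.config_from_object('django.conf:settings', namespace='CELERY') in celery.py below, Celery 4 only reads Django settings that carry the CELERY_ prefix, so a bare BROKER_URL is ignored. A minimal sketch of the prefixed equivalents, with an rpc:// result backend that is my assumption and not part of the original post:

CELERY_BROKER_URL = 'amqp://guest:guest@localhost//'
CELERY_RESULT_BACKEND = 'rpc://'  # a result backend is needed for job.state / job.result
CELERY_ACCEPT_CONTENT = ['json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'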
Here is my celery.py:

from __future__ import absolute_import, unicode_literals
import os
from celery import Celery

# set the default Django settings module for the 'celery' program.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'mysite.settings')

app = Celery('mysite',
    backend='amqp',
    broker='amqp://guest@localhost//')

# Using a string here means the worker doesn't have to serialize
# the configuration object to child processes.
# - namespace='CELERY' means all celery-related configuration keys
#   should have a `CELERY_` prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Load task modules from all registered Django app configs.
app.autodiscover_tasks()


@app.task(bind=True)
def debug_task(self):
    print('Request: {0!r}'.format(self.request))
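In Celery 4 the 'amqp' result backend is deprecated; 'rpc://' is the documented replacement when results should travel back over RabbitMQ. A hedged sketch of the same app created with it (not the poster's original code):

from celery import Celery

app = Celery('mysite',
    backend='rpc://',  # replaces the deprecated 'amqp' result backend
    broker='amqp://guest:guest@localhost//')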
And here is my tasks.py:

# Create your tasks here
from __future__ import absolute_import, unicode_literals
from celery import shared_task, current_task


@shared_task
def add(x, y):
    test = "ok"
    current_task.update_state(state='PROGRESS',
        meta={'test': test})
    return x + y
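Besides importing current_task, the documented alternative is to bind the task and call self.update_state. A short sketch of that idiom (my rewrite, not the poster's original tasks.py):

from __future__ import absolute_import, unicode_literals
from celery import shared_task


@shared_task(bind=True)
def add(self, x, y):
    # report an intermediate state before the final result is stored
    self.update_state(state='PROGRESS', meta={'test': 'ok'})
    return x + y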
Here is my Django shell:

>>> from blog.tasks import *
>>> job = add.delay(2,3)
>>> job.state
'PENDING'
>>> job.result
>>>
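Note that 'PENDING' is also what Celery reports for any task id it knows nothing about, so the state alone does not distinguish a task waiting for a worker from one that never ran. A quick check, assuming the same shell session:

>>> from celery.result import AsyncResult
>>> AsyncResult('no-such-task-id').state  # unknown ids also report PENDING
'PENDING'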
A picture of my RabbitMQ interface:

You need to start a worker that will process the tasks added to the queue. From your virtualenv, run:

celery worker -A mysite -l info
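
Once a worker is consuming from the queue, the same shell call can block on the result, roughly like this (a sketch that assumes the worker above is running; the 10-second timeout is arbitrary):

>>> from blog.tasks import add
>>> job = add.delay(2, 3)
>>> job.get(timeout=10)  # blocks until the worker stores the result
5
>>> job.state
'SUCCESS'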

Thank you for your quick answer, please see below.