Python Celery tasks not being processed


I'm trying to get some tasks processed with Celery, but I'm not having much luck. I'm running celeryd and celerybeat as daemons. I have a tasks.py file that looks like this, with a simple app and task defined in it:

from celery import Celery

app = Celery('tasks', broker='amqp://user:pass@hostname:5672/vhostname')

@app.task
def process_file(f):
    # do some stuff
    # and log results
    pass
This file is referenced from another file, process.py, which I use to monitor for file changes, and which looks like this:

from tasks import process_file

file_name = '/file/to/process'
# send the task to the broker for a worker to pick up
result = process_file.delay(file_name)
# block until the worker returns a result
result.get()
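
The snippet above only shows the hand-off to Celery; the actual file-watching loop in process.py isn't included here. As a rough sketch of how that loop might look, assuming the third-party watchdog package and a hypothetical /dir/to/watch path (neither of which appears in the question):

import time

from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer

from tasks import process_file

class ChangeHandler(FileSystemEventHandler):
    def on_modified(self, event):
        # hand each changed file off to a Celery worker
        if not event.is_directory:
            process_file.delay(event.src_path)

observer = Observer()
observer.schedule(ChangeHandler(), '/dir/to/watch', recursive=False)
observer.start()
try:
    while True:
        time.sleep(1)
finally:
    observer.stop()
    observer.join()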

My celerybeat daemon configuration looks like this:

CELERY_BIN="/usr/local/bin/celery"
CELERYBEAT_CHDIR="/opt/dirwithpyfiles"
CELERYBEAT_OPTS="--schedule=/var/run/celery/celerybeat-schedule"

With all of this in place, though, Celery doesn't seem to see the tasks or process them. I can run equivalent code in the Python interpreter, and Celery does process it:

>>> from tasks import process_file
>>> process_file.delay('/file/to/process')
<AsyncResult: 8af23a4e-3f26-469c-8eee-e646b9d28c7b>
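
As a quick sanity check, assuming the daemonized worker is running against the same broker, the workers can also be asked which tasks they have registered; if tasks.process_file is missing from the reply (or there is no reply at all), the worker isn't loading the same tasks module:

>>> from tasks import app
>>> app.control.inspect().registered()
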
/etc/default/celeryd

CELERYD_NODES="worker1"
CELERY_BIN="/usr/local/bin/celery"
CELERYD_CHDIR="/opt/dirwithpyfiles"
CELERYD_OPTS="--time-limit=300 --concurrency=8"
CELERYD_USER="celery"
CELERYD_GROUP="celery"
CELERYD_LOG_FILE="/var/log/celery/%N.log"
CELERYD_PID_FILE="/var/run/celery/%N.pid"
CELERY_CREATE_DIRS=1

So I figured out my issue by running celery from the CLI rather than as a daemon, which gave me much more detailed output about the errors that were happening. I did this by running:

user@hostname   /opt/dirwithpyfiles $ su celery
celery@hostname /opt/dirwithpyfiles $ celery -A tasks worker --loglevel=info

There I could see that a permissions issue was occurring as the celery user that did not occur when I ran the commands in the Python interpreter as my regular user. I fixed this by changing the permissions of /file/to/process so that both users could read from it.
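
A minimal way to do that, assuming the file is owned by the regular user, is to add read permission for other users, for example:

chmod o+r /file/to/process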