
Python: I can't get Celery to work properly (AWS Elastic Beanstalk)


I'm moving a Django application to Python 3 / Django 1.10. Part of this process also includes a new deployment; we use AWS EB.

The Celery tasks were fine before the migration, but now I can't get tasks to work properly.

Packages:

...
celery==3.1.23
Django==1.10.6
django-celery==3.2.1
...
Python:

Python 3.4.3
In the supervisor config I added a program entry that runs the Celery worker:

[program:celeryd-workers]
; Set full path to celery program if using virtualenv
command=/opt/python/run/venv/bin/python /opt/python/current/app/manage.py celery worker -A app --app=app.celery_app:app -l DEBUG -c 4
directory=/opt/python/current/app
user=nobody
numprocs=1
autostart=true
autorestart=true
startsecs=10
stopwaitsecs = 600
killasgroup=true    
stdout_logfile=/var/log/celery-worker.log
stderr_logfile=/var/log/celery-worker.log
environment=PYTHONPATH="/opt/python/current/app/:",PATH="/opt/python/run/venv/bin/:%(ENV_PATH)s",DJANGO_SETTINGS_MODULE="settings.qa"

[program:celeryd-beat]
; Set full path to celery program if using virtualenv
command=/opt/python/run/venv/bin/python /opt/python/current/app/manage.py celery beat -A app --app=app.celery_app:app --loglevel=DEBUG --workdir=/tmp --pidfile=/tmp/celerybeat.pid -s /tmp/celerybeat-schedule.db
directory=/opt/python/current/app
user=nobody
numprocs=1
autostart=true
autorestart=true
startsecs=10
stopwaitsecs = 600
killasgroup=true

stdout_logfile=/var/log/celery-beat.log
stderr_logfile=/var/log/celery-beat.log
environment=PYTHONPATH="/opt/python/current/app/:",PATH="/opt/python/run/venv/bin/:%(ENV_PATH)s",DJANGO_SETTINGS_MODULE="settings.qa"
My celery_app.py is very simple:

from __future__ import unicode_literals, absolute_import

from celery import Celery
from django.conf import settings

app = Celery()

app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
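For context, `autodiscover_tasks()` tries to import a `tasks` submodule from each entry of `INSTALLED_APPS` and registers the `@app.task` functions it finds there. A rough pure-Python sketch of the module names it probes (the app labels below are hypothetical examples, not from the project):

```python
# Rough sketch (not Celery internals) of the module names that
# autodiscover_tasks() probes: "<app>.tasks" for each installed app.
INSTALLED_APPS = ['app.core', 'django.contrib.admin']  # hypothetical labels

def candidate_task_modules(installed_apps):
    return ['%s.tasks' % name for name in installed_apps]

print(candidate_task_modules(INSTALLED_APPS))
# ['app.core.tasks', 'django.contrib.admin.tasks']
```

This is why a task defined in `app/core/tasks.py` is importable as `app.core.tasks.add` later on.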
Configuration

On the EC2 instance, the Celery settings are:

BROKER_URL = 'the aws elastic cache redis url'
CELERY_RESULT_BACKEND = 'djcelery.backends.database:DatabaseBackend'

BROKER_TRANSPORT_OPTIONS = {
    'visibility_timeout': 600,
}

BROKER_POOL_LIMIT = 1

CELERY_ACCEPT_CONTENT = ['pickle', 'json', 'msgpack']

CELERY_DEFAULT_QUEUE = 'default'
CELERY_QUEUES = {
    'default': {
        'exchange': 'default',
        'exchange_type': 'topic',
        'binding_key': 'tasks.#'
    }
}

CELERY_ALWAYS_EAGER = True
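As an aside, the queue above is declared on a topic exchange with binding key `tasks.#`. A simplified pure-Python sketch of AMQP topic matching (not the broker's implementation), just to illustrate which routing keys such a binding would catch:

```python
import re

def topic_matches(binding_key, routing_key):
    # Simplified AMQP topic semantics: '*' matches exactly one
    # dot-separated word, '#' matches zero or more words.
    pattern = re.escape(binding_key)
    pattern = pattern.replace(r'\.\#', r'(?:\..*)?')  # trailing '#' may match nothing
    pattern = pattern.replace(r'\#', r'.*')
    pattern = pattern.replace(r'\*', r'[^.]+')
    return re.fullmatch(pattern, routing_key) is not None

print(topic_matches('tasks.#', 'tasks.add'))   # True
print(topic_matches('tasks.#', 'tasks'))       # True: '#' can match zero words
print(topic_matches('tasks.#', 'celery'))      # False
```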
From the instance I checked the configuration (from django.conf import settings), and verified that the instance can connect to Redis using redis-cli.

What doesn't work?

Basically, if I run a task, even the simplest one from the Celery docs such as add(x, y), no task shows up in ./manage.py celery events. I left events open for hours and tried running tasks from the application, but nothing happened; it seems stuck:

 No task selected
 Workers online: celery@ip-xx-xx-xx-xx
 Info: events: 2187 tasks:0 workers:1/1
 Keys: j:down k:up i:info t:traceback r:result c:revoke ^c: quit
The strange thing is that if I run this task from the docs inside the application:

In [1]: from app.core.tasks import add

In [2]: result = add.delay(2,2)

In [3]: result.get()
Out[3]: 4
the result comes through, but I can't see any task in events. And if I check celery inspect stats:

...
       "pool": {
            "max-concurrency": 4,
            "max-tasks-per-child": "N/A",
            "processes": [
                11363,
                11364,
                11365,
                11366
            ],
            "put-guarded-by-semaphore": false,
            "timeouts": [
                0,
                0
            ],
            "writes": {
                "all": "",
                "avg": "0.00%",
                "inqueues": {
                    "active": 0,
                    "total": 4
                },
                "raw": "",
                "total": 0
            }
        },
....
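Reading that output, the `writes` block is the telling part: a total of 0 means the master process never handed a single task to any of the 4 pool processes. A small sketch of that check (the JSON is abbreviated from the output above):

```python
import json

# Abbreviated "pool" section from `celery inspect stats` above.
stats = json.loads('''{
    "pool": {
        "max-concurrency": 4,
        "writes": {"inqueues": {"active": 0, "total": 4}, "total": 0}
    }
}''')

writes = stats["pool"]["writes"]
# writes["total"] == 0: no task was ever dispatched to the worker pool.
print(writes["total"] == 0)   # True
```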
Nothing seems to go through.

The processes are running, of course:

[ec2-user@xxx]$ ps aux | grep "celery"
nobody   11350  0.0  3.7 251484 77408 ?        S    08:11   0:01 /opt/python/run/venv/bin/python /opt/python/current/app/manage.py celery beat -A app --app=app.celery_app:app --loglevel=DEBUG --workdir=/tmp --pidfile=/tmp/celerybeat.pid -s /tmp/celerybeat-schedule.db
nobody   11351  0.1  3.8 247804 79848 ?        S    08:11   0:07 /opt/python/run/venv/bin/python /opt/python/current/app/manage.py celery worker -A app --app=app.celery_app:app -l DEBUG -c 4
nobody   11363  0.0  3.4 243876 70956 ?        S    08:11   0:00 /opt/python/run/venv/bin/python /opt/python/current/app/manage.py celery worker -A app --app=app.celery_app:app -l DEBUG -c 4
nobody   11364  0.0  3.4 243876 71024 ?        S    08:11   0:00 /opt/python/run/venv/bin/python /opt/python/current/app/manage.py celery worker -A app --app=app.celery_app:app -l DEBUG -c 4
nobody   11365  0.0  3.4 243876 71024 ?        S    08:11   0:00 /opt/python/run/venv/bin/python /opt/python/current/app/manage.py celery worker -A app --app=app.celery_app:app -l DEBUG -c 4
nobody   11366  0.0  3.4 243876 71024 ?        S    08:11   0:00 /opt/python/run/venv/bin/python /opt/python/current/app/manage.py celery worker -A app --app=app.celery_app:app -l DEBUG -c 4
And the logs:

[ec2-user@xxxx log]$ tail  /var/log/celery-worker.log
[2017-03-14 20:34:45,126: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2017-03-14 20:34:50,124: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2017-03-14 20:34:55,125: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2017-03-14 20:35:00,124: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2017-03-14 20:35:05,125: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2017-03-14 20:35:10,124: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2017-03-14 20:35:15,125: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2017-03-14 20:35:20,124: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2017-03-14 20:35:25,125: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
[2017-03-14 20:35:30,124: DEBUG/MainProcess] pidbox received method enable_events() [reply_to:None ticket:None]
Celery beat:

[ec2-user@xxxx log]$ tail  /var/log/celery-beat.log
>>>> Testing: False
celery beat v3.1.23 (Cipater) is starting.
__    -    ... __   -        _
Configuration ->
    . broker -> redis://qa-redis.xxxx:6379//
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> celery.beat.PersistentScheduler
    . db -> /tmp/celerybeat-schedule.db
    . logfile -> [stderr]@%DEBUG
    . maxinterval -> now (0s)
Any idea what I'm doing wrong?

Your settings file has

CELERY_ALWAYS_EAGER = True

which basically means tasks will skip the queue and run locally, in the calling process. That's why you get the result without seeing any events. Have a look at the Celery documentation for that setting.

I would try removing that setting and working from there.
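A toy pure-Python sketch (not Celery internals) of why eager mode returns results while nothing ever reaches the broker or shows up in events:

```python
import queue

task_queue = queue.Queue()   # stands in for the Redis broker
ALWAYS_EAGER = True          # mirrors CELERY_ALWAYS_EAGER

def delay(func, *args):
    if ALWAYS_EAGER:
        return func(*args)          # runs in the calling process: no broker,
                                    # no worker, and therefore no event emitted
    task_queue.put((func, args))    # normal path: enqueue for a worker
    return None

def add(x, y):
    return x + y

print(delay(add, 2, 2))      # 4 -- the result you saw in the shell
print(task_queue.empty())    # True -- nothing for `celery events` to observe
```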

This doesn't sound like an Elastic Beanstalk deployment, because unless you're using Docker I don't see how you can run both the web server and the workers on the same deployment. If you really want to, switch to Docker.

Did you find the problem? I'm migrating from Python 2 to Python 3 and hitting the same issue running Celery.

@awwester With celery worker or celery beat?