Python Django + Celery + Supervisord setup: Redis error
I am working through the setup of the following components on a CentOS server. I got the supervisord task to start and run the website, but I am stuck setting up the supervisor entry for Celery. It seems to recognize the tasks, but when I try to execute them it does not connect to them. My redis is running on port 6380.
Django==1.10.3
amqp==1.4.9
billiard==3.3.0.23
celery==3.1.25
kombu==3.0.37
pytz==2016.10
My celery supervisord config:
[program:celeryd]
command=/root/myproject/myprojectenv/bin/celery worker -A mb --loglevel=INFO
environment=PATH="/root/myproject/myprojectenv/bin/",VIRTUAL_ENV="/root/myproject/myprojectenv",PYTHONPATH="/root/myproject/myprojectenv/lib/python2.7:/root/myproject/myprojectenv/lib/python2.7/site-packages"
directory=/home/.../myapp/
user=nobody
numprocs=1
stdout_logfile=/home/.../myapp/log_celery/worker.log
stderr_logfile=/home/.../myapp/log_celery/worker.log
autostart=true
autorestart=true
startsecs=10
; Need to wait for currently executing tasks to finish at shutdown.
; Increase this if you have very long running tasks.
stopwaitsecs = 1200
; When resorting to send SIGKILL to the program to terminate it
; send SIGKILL to its whole process group instead,
; taking care of its children as well.
killasgroup=true
; Set Celery priority higher than default (999)
; so, if rabbitmq(redis) is supervised, it will start first.
priority=1000
The process starts, and when I go to the project folder and run:
>python manage.py celery status
celery@ssd-1v: OK
1 node online.
When I open the celery log file, I see the tasks are loaded:
[tasks]
. mb.tasks.add
. mb.tasks.update_search_index
. orders.tasks.order_created
My mb/tasks.py:
from mb.celeryapp import app
import django
django.setup()
@app.task
def add(x, y):
    print(x + y)
    return x + y
My mb/celeryapp.py:
from __future__ import absolute_import, unicode_literals
import os
from celery import Celery
from django.conf import settings
# set the default Django settings module for the 'celery' program.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mb.settings")
app = Celery('mb', broker='redis://localhost:6380/', backend='redis://localhost:6380/')
app.conf.broker_url = 'redis://localhost:6380/0'
app.conf.result_backend = 'redis://localhost:6380/'
app.conf.timezone = 'Europe/Sofia'
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
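As an aside, the ordering in this module can matter on Celery 3.1: django.conf.settings is imported before DJANGO_SETTINGS_MODULE is set, and the broker is configured in three competing places (the Celery(...) constructor, app.conf, and config_from_object). A minimal sketch of one way to keep a single source of truth, assuming the same module names and paths as above (not the asker's actual fix):

```python
# mb/celeryapp.py -- sketch, assuming Celery 3.1 and the settings shown above
from __future__ import absolute_import, unicode_literals
import os

# Set the settings module BEFORE anything imports django.conf.settings,
# so Django reads mb/settings.py rather than a default configuration.
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "mb.settings")

from celery import Celery
from django.conf import settings

app = Celery('mb')
# Read BROKER_URL (redis://localhost:6380/0) etc. from Django settings
# instead of repeating the URL in the constructor and in app.conf.
app.config_from_object('django.conf:settings')
app.autodiscover_tasks(lambda: settings.INSTALLED_APPS)
```

With this layout the broker URL lives only in settings.py, so the worker, the web process, and manage.py commands all read the same value.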
My mb/settings.py:
...
WSGI_APPLICATION = 'mb.wsgi.application'
BROKER_URL = 'redis://localhost:6380/0'
CELERYBEAT_SCHEDULER = 'djcelery.schedulers.DatabaseScheduler'
STATICFILES_STORAGE = 'whitenoise.storage.CompressedManifestStaticFilesStorage'
...
When I run:
python manage.py shell
>>> from mb.tasks import add
>>> add.name
'mb.tasks.add'
>>> result=add.delay(1,1)
>>> result.ready()
False
>>> result.status
'PENDING'
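One caveat about result.status being 'PENDING': in Celery, PENDING is the default state reported for any task id the result backend has no record of, so it cannot distinguish "queued but not yet picked up" from "never stored at all". A stdlib-only sketch of that lookup semantics (the backend dict here is a hypothetical stand-in for the Redis result backend, mirroring Celery's documented behaviour):

```python
# Hypothetical stand-in for a result backend keyed by task id.
backend = {"finished-task-id": "SUCCESS"}

def task_state(task_id):
    # Mirrors Celery's behaviour: ids unknown to the backend
    # report PENDING by default, not an error.
    return backend.get(task_id, "PENDING")

print(task_state("finished-task-id"))  # SUCCESS
print(task_state("never-seen-id"))     # PENDING
```

So a permanently PENDING result is consistent with the worker simply never consuming the message.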
As mentioned before, I see no further change in the logs.
If I try to run from the command line:
/root/myproject/myprojectenv/bin/celery worker -A mb --loglevel=INFO
Running a worker with superuser privileges when the
worker accepts messages serialized with pickle is a very bad idea!
If you really want to continue then you have to set the C_FORCE_ROOT
environment variable (but please think about this before you do).
User information: uid=0 euid=0 gid=0 egid=0
But I guess that is expected, since supervisord runs it as user nobody. The interesting part is that a bare celery status (without python manage.py celery status) gives a connection error, probably because it is looking for redis on a different port, while the process started by supervisord runs fine... and when I call celery worker -A mb it says everything is OK. Any ideas?
(myprojectenv) [root@ssd-1v]# celery status
Traceback (most recent call last):
  File "/root/myproject/myprojectenv/bin/celery", line 11, in <module>
    sys.exit(main())
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/__main__.py", line 30, in main
    main()
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line 81, in main
    cmd.execute_from_commandline(argv)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line 793, in execute_from_commandline
    super(CeleryCommand, self).execute_from_commandline(argv)))
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/base.py", line 311, in execute_from_commandline
    return self.handle_argv(self.prog_name, argv[1:])
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line 785, in handle_argv
    return self.execute(command, argv)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line 717, in execute
    ).run_from_argv(self.prog_name, argv[1:], command=argv[0])
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/base.py", line 315, in run_from_argv
    sys.argv if argv is None else argv, command)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/base.py", line 377, in handle_argv
    return self(*args, **options)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/base.py", line 274, in __call__
    ret = self.run(*args, **kwargs)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line 473, in run
    replies = I.run('ping', **kwargs)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line 325, in run
    return self.do_call_method(args, **kwargs)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/bin/celery.py", line 347, in do_call_method
    return getattr(i, method)(*args)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/app/control.py", line 100, in ping
    return self._request('ping')
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/app/control.py", line 71, in _request
    timeout=self.timeout, reply=True,
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/celery/app/control.py", line 316, in broadcast
    limit, callback, channel=channel,
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/kombu/pidbox.py", line 283, in _broadcast
    chan = channel or self.connection.default_channel
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/kombu/connection.py", line 771, in default_channel
    self.connection
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/kombu/connection.py", line 756, in connection
    self._connection = self._establish_connection()
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/kombu/connection.py", line 711, in _establish_connection
    conn = self.transport.establish_connection()
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/kombu/transport/pyamqp.py", line 116, in establish_connection
    conn = self.Connection(**opts)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/amqp/connection.py", line 165, in __init__
    self.transport = self.Transport(host, connect_timeout, ssl)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/amqp/connection.py", line 186, in Transport
    return create_transport(host, connect_timeout, ssl)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/amqp/transport.py", line 299, in create_transport
    return TCPTransport(host, connect_timeout)
  File "/root/myproject/myprojectenv/lib/python2.7/site-packages/amqp/transport.py", line 95, in __init__
    raise socket.error(last_err)
socket.error: [Errno 111] Connection refused
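The traceback itself hints at why the bare command fails: the connection goes through kombu/transport/pyamqp.py, meaning Celery fell back to its default AMQP broker because no app was given with -A, so the redis://localhost:6380 configuration was never loaded. A tiny stdlib-only illustration of the two broker URLs involved (the redis URL comes from the question; the fallback is Celery 3.1's default amqp broker on RabbitMQ's standard port):

```python
from urllib.parse import urlparse

# Broker the app is configured with (from mb/celeryapp.py):
configured = urlparse("redis://localhost:6380/0")

# What a bare `celery status` (no -A) falls back to on Celery 3.1:
fallback = urlparse("amqp://guest:guest@localhost:5672//")

print(configured.scheme, configured.port)  # redis 6380
print(fallback.scheme, fallback.port)      # amqp 5672
```

Nothing listens on 5672 on this server, hence "[Errno 111] Connection refused".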
OK, the answer in this case was that the gunicorn file was actually starting the project from the global Python libraries, not from the virtual environment.

Comments:

You should probably use celery status -A mb instead, and check your settings: what does INSTALLED_APPS contain at the moment you autodiscover tasks? You also import django.conf.settings before the DJANGO_SETTINGS_MODULE env variable is set, which could be a problem. Beyond that I don't know what exactly is wrong; you could try running the official Django example project in your project's folder and adapt it to use redis (or use rabbitmq as the broker).

celery status -A mb gives OK: celery@ssd-1v: OK. Tried removing settings.INSTALLED_APPS and removing the import of django.conf.settings, but still no change :/
$ python manage.py shell
>>> from mb.tasks import add
>>> add
<@task: mb.tasks.add of mb:0x2b3f6d0>
[config]
- ** ---------- .> app: mb:0x3495bd0
- ** ---------- .> transport: redis://localhost:6380/0
- ** ---------- .> results: disabled://
- *** --- * --- .> concurrency: 1 (prefork)
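Given that the root cause was the web process loading packages from the system Python rather than the virtualenv, a quick sanity check is to compare each imported module's file path against the env root. A small stdlib-only helper (hypothetical function name; the env path is taken from the question):

```python
import os

def resolves_under(module_file, env_root):
    """Return True if an imported module's file lives under env_root.

    Inside `python manage.py shell`, something like
    `import celery; resolves_under(celery.__file__, ENV)` would confirm
    whether the virtualenv copy or the system copy is being loaded.
    """
    module_file = os.path.abspath(module_file)
    env_root = os.path.abspath(env_root)
    return os.path.commonpath([module_file, env_root]) == env_root

ENV = "/root/myproject/myprojectenv"
print(resolves_under(ENV + "/lib/python2.7/site-packages/celery/__init__.py", ENV))  # True
print(resolves_under("/usr/lib/python2.7/site-packages/celery/__init__.py", ENV))    # False
```

Also note the two different app objects in the output above (mb:0x2b3f6d0 in the shell vs mb:0x3495bd0 in the worker banner), which is consistent with two separate processes loading the app from different environments.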