Django Model.objects.all() returns an empty QuerySet in a Celery task


project/project/settings.py

...

CELERY_BEAT_SCHEDULE = {
  'find-subdomains': {
    'task': 'subdiscovery.tasks.mytask',
    'schedule': 10.0
  }
}
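For context, a `CELERY_BEAT_SCHEDULE` like the one above is only picked up if the project has a Celery app module wired to Django settings. A minimal sketch of that module, following the standard layout from the Celery "First steps with Django" guide (the module path `project/project/celery.py` is assumed from the settings path above):

```python
# project/project/celery.py -- minimal Celery app module this schedule assumes
import os

from celery import Celery

# Point Celery at the Django settings before the app is created
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'project.settings')

app = Celery('project')

# Read all CELERY_* settings (including CELERY_BEAT_SCHEDULE) from settings.py
app.config_from_object('django.conf:settings', namespace='CELERY')

# Find tasks.py modules in installed apps, e.g. subdiscovery/tasks.py
app.autodiscover_tasks()
```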
project/subdiscovery/tasks.py

from __future__ import absolute_import, unicode_literals
from celery import shared_task

from subdiscovery.models import Domain

@shared_task
def mytask():
    print(Domain.objects.all())

    return 99
The Celery worker shows an empty QuerySet:

celery_worker_1  | [2019-08-12 07:07:44,229: WARNING/ForkPoolWorker-2] <QuerySet []>
celery_worker_1  | [2019-08-12 07:07:44,229: INFO/ForkPoolWorker-2] Task subdiscovery.tasks.mytask[60c59024-cd19-4ce9-ae69-782a3a81351b] succeeded in 0.004897953000181587s: 99
But exec-ing into the running container and starting the worker by hand works as expected:

[2019-08-13 05:12:28,945: INFO/MainProcess] Received task: subdiscovery.tasks.mytask[7b2760cf-1e7f-41f8-bc13-fa4042eedf33]  
[2019-08-13 05:12:28,957: WARNING/ForkPoolWorker-8] <QuerySet [<Domain: uber.com>, <Domain: example1.com>, <Domain: example2.com>, <Domain: example3.com>]>
Any idea why the worker doesn't work when started by docker-compose, but does work when I exec into the running container and start it there?

Reposted from reddit:

The problem here is that the Celery worker can't see the SQLite database. You need to either switch to a different DB or make your /app volume visible to the worker:

version: '3'

services:
    ...
    celery_worker:
        working_dir: /app
        command: sh -c './wait-for web:8000 && ./wait-for redis:6379 -- celery -A project worker -l info'
        image: app-image
        volumes: # <-here
            - .:/app
        depends_on:
            - web
            - redis
    ...
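The other option mentioned above is switching to a client/server database, which every container can reach over the network regardless of volume mounts. A hedged sketch of what that would look like in settings.py; the service name `db` and the credentials are illustrative assumptions, not from the original post:

```python
# project/project/settings.py -- sketch of the "switch to a different DB"
# option. A Postgres server is reachable by all compose services, unlike a
# SQLite file that lives inside one container's filesystem.
# The host 'db', database name, user, and password below are assumptions.
DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.postgresql',
        'NAME': 'project',
        'USER': 'postgres',
        'PASSWORD': 'postgres',
        'HOST': 'db',    # the compose service name, not localhost
        'PORT': 5432,
    }
}
```

This also requires adding a matching `db` service (e.g. a `postgres` image) to docker-compose.yml and installing a Postgres driver such as `psycopg2`.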
Comments: Isn't it possible the objects were added in between? — No, the task is run periodically by the beat scheduler and gives the same result every 10 seconds. — Try restarting Celery.
For reference, the original docker-compose.yml from the question (note that the celery_worker service mounts no volume):

version: '3'

services:
    web:
        build: .
        image: app-image
        ports:
            - 80:8000
        volumes:
            - .:/app
        command: gunicorn -b 0.0.0.0:8000 project.wsgi
    redis:
        image: "redis:alpine"
        ports:
            - 6379:6379
    celery_worker:
        working_dir: /app
        command: sh -c './wait-for web:8000 && ./wait-for redis:6379 -- celery -A project worker -l info'
        image: app-image
        depends_on:
            - web
            - redis
    celery_beat:
        working_dir: /app
        command: sh -c 'celery -A project beat -l info'
        image: app-image
        depends_on:
            - celery_worker
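In the compose file above, only the `web` service mounts `.:/app`, so its writes to db.sqlite3 land on the host, while the worker container only sees whatever copy of the file was baked into the image. The failure mode can be reproduced without Docker: SQLite is just a file, and two processes with different views of the project directory each open their own db.sqlite3. A minimal stdlib demonstration (temp-dir paths are illustrative):

```python
# Two processes resolving the same relative path "db.sqlite3" from different
# directories get two independent databases -- analogous to the web container
# (volume-mounted) and the worker container (image copy only).
import os
import sqlite3
import tempfile

root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "web"))
os.makedirs(os.path.join(root, "worker"))

# "web" process: creates and populates its database
os.chdir(os.path.join(root, "web"))
con = sqlite3.connect("db.sqlite3")
con.execute("CREATE TABLE domain (name TEXT)")
con.execute("INSERT INTO domain VALUES ('uber.com')")
con.commit()
web_rows = con.execute("SELECT name FROM domain").fetchall()

# "worker" process: same relative path, different directory -> a fresh, empty DB
os.chdir(os.path.join(root, "worker"))
con2 = sqlite3.connect("db.sqlite3")
con2.execute("CREATE TABLE domain (name TEXT)")
worker_rows = con2.execute("SELECT name FROM domain").fetchall()

print(web_rows)     # [('uber.com',)]
print(worker_rows)  # []
```

Mounting the same `.:/app` volume into the worker (as in the accepted fix) makes both containers resolve `/app/db.sqlite3` to the same host file.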
