
Python: Celery beat does not discover tasks

Tags: python, django, celery, celery-task, celerybeat


In my application I have tasks that run with Celery. I set it up in my development environment without any trouble, and it works perfectly with Redis as the broker. Yesterday I transferred the code to my server and installed Redis, but Celery cannot discover the tasks. The code is the same.

My celery_conf.py file (originally celery.py):
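The contents of that file are not reproduced here; a typical Django `celery_conf.py` of this shape looks roughly like the sketch below. This is an assumption, not the author's actual file: the project name `vertNews` is taken from the beat command shown further down, and the settings module path is a guess.

```python
# celery_conf.py — sketch of a typical Django Celery app module
# (project name "vertNews" and settings path are assumptions)
from __future__ import absolute_import, unicode_literals

import os

from celery import Celery

# Point Celery at the Django settings module before creating the app.
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'vertNews.settings')

app = Celery('vertNews')

# Read all CELERY_*-prefixed options from Django settings;
# the namespace argument strips the "CELERY_" prefix.
app.config_from_object('django.conf:settings', namespace='CELERY')

# Find tasks.py modules in every installed Django app.
app.autodiscover_tasks()
```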

The Celery configuration in settings:

# Celery Configuration

CELERY_TASK_ALWAYS_EAGER = False
CELERY_BROKER_URL = SECRETS['celery']['broker_url']
CELERY_RESULT_BACKEND = SECRETS['celery']['result_backend']
CELERY_ACCEPT_CONTENT = ['application/json']
CELERY_TASK_SERIALIZER = 'json'
CELERY_RESULT_SERIALIZER = 'json'
CELERY_TIMEZONE = TIME_ZONE
The root application's __init__.py:

# coding: utf-8
from __future__ import absolute_import, unicode_literals

from .celery_conf import app as celery_app

__all__ = ['celery_app']
My tasks:

# coding=utf-8
from __future__ import unicode_literals, absolute_import

import logging
from celery.schedules import crontab
from celery.task import periodic_task
from .api import fetch_tweets, delete_tweets


logger = logging.getLogger(__name__)


@periodic_task(
    run_every=(crontab(minute=10, hour='0, 6, 12, 18, 23')),
    name="fetch_tweets_task",
    ignore_result=True)
def fetch_tweets_task():
    logger.info("Tweet download started")
    fetch_tweets()
    logger.info("Tweet download and summarization finished")


@periodic_task(
    run_every=(crontab(minute=13, hour=13)),
    name="delete_tweets_task",
    ignore_result=True)
def delete_tweets_task():
    logger.info("Tweet deletion started")
    delete_tweets()
    logger.info("Tweet deletion finished")
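As a side note, the same two schedules can also be declared as configuration using Celery 4's beat_schedule setting instead of the `periodic_task` decorator. A sketch, not the author's code; the task names match the `name=` arguments above, and the `CELERY_` prefix assumes the same settings namespace used above:

```python
# Sketch: the two crontab schedules expressed as Celery 4 configuration
# rather than via the periodic_task decorator.
from celery.schedules import crontab

CELERY_BEAT_SCHEDULE = {
    'fetch_tweets_task': {
        'task': 'fetch_tweets_task',
        # at minute 10 of hours 0, 6, 12, 18 and 23
        'schedule': crontab(minute=10, hour='0,6,12,18,23'),
    },
    'delete_tweets_task': {
        'task': 'delete_tweets_task',
        # daily at 13:13
        'schedule': crontab(minute=13, hour=13),
    },
}
```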
The beat output when running on the remote server (not working) and on the development server (working) is shown below.

I don't know exactly what the problem was, but clearing all *.pyc files in the project solved it.
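Stale bytecode from the old module layout (the file was renamed from celery.py to celery_conf.py) can shadow the new code. A sketch of one way to clear it, run from the project root:

```shell
# Delete stale compiled bytecode so Python regenerates it from the
# current sources.
find . -name '*.pyc' -delete
# Python 3 stores bytecode in __pycache__ directories instead.
find . -type d -name '__pycache__' -prune -exec rm -rf {} +
```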

On the remote server (not working; note the empty schedule):

(trendiz) kenneth@bots:~/projects/verticals-news/src$ celery -A vertNews beat -l debug
Trying import production.py settings...
celery beat v4.0.2 (latentcall) is starting.
__    -    ... __   -        _
LocalTime -> 2017-04-03 13:55:49
Configuration ->
    . broker -> redis://localhost:6379//
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> celery.beat.PersistentScheduler
    . db -> celerybeat-schedule
    . logfile -> [stderr]@%DEBUG
    . maxinterval -> 5.00 minutes (300s)
[2017-04-03 13:55:49,770: DEBUG/MainProcess] Setting default socket timeout to 30
[2017-04-03 13:55:49,771: INFO/MainProcess] beat: Starting...
[2017-04-03 13:55:49,785: DEBUG/MainProcess] Current schedule:

[2017-04-03 13:55:49,785: DEBUG/MainProcess] beat: Ticking with max interval->5.00 minutes
[2017-04-03 13:55:49,785: DEBUG/MainProcess] beat: Waking up in 5.00 minutes.
On the development server (working; both tasks appear in the schedule):

LocalTime -> 2017-04-03 14:16:19
Configuration ->
    . broker -> redis://localhost:6379//
    . loader -> celery.loaders.app.AppLoader
    . scheduler -> celery.beat.PersistentScheduler
    . db -> celerybeat-schedule
    . logfile -> [stderr]@%DEBUG
    . maxinterval -> 5.00 minutes (300s)
[2017-04-03 14:16:19,919: DEBUG/MainProcess] Setting default socket timeout to 30
[2017-04-03 14:16:19,919: INFO/MainProcess] beat: Starting...
[2017-04-03 14:16:19,952: DEBUG/MainProcess] Current schedule:
<ScheduleEntry: fetch_tweets_task fetch_tweets_task() <crontab: 36 0, 6, 12, 18, 22 * * * (m/h/d/dM/MY)>
<ScheduleEntry: delete_tweets_task delete_tweets_task() <crontab: 13 13 * * * (m/h/d/dM/MY)>
[2017-04-03 14:16:19,952: DEBUG/MainProcess] beat: Ticking with max interval->5.00 minutes
[2017-04-03 14:16:19,953: DEBUG/MainProcess] beat: Waking up in 5.00 minutes.