Python Django: adding a custom logger to Celery logging

I have added a custom log handler to my Django application that writes log entries to the database:

import logging

from my_app.models import MyModel  # assumed import path for the log-entry model


class DbLogHandler(logging.Handler):  # Inherit from logging.Handler
    def __init__(self):
        # run the regular Handler __init__
        logging.Handler.__init__(self)
        self.entries = []
        logging.debug("*****************[DB] INIT db handler")

    def emit(self, record):
        # instantiate the model
        logging.debug("*****************[DB] called emit on db handler")
        try:
            revision_instance = getattr(record, 'revision', None)
            # skip records that carry no revision before building the model
            if revision_instance is None:
                return
            log_entry = MyModel(name=record.name,
                                log_level_name=record.levelname,
                                message=record.msg,
                                module=record.module,
                                func_name=record.funcName,
                                line_no=record.lineno,
                                exception=record.exc_text,
                                revision=revision_instance)
            self.entries.append(log_entry)
        except Exception as ex:
            print(ex)

    def flush(self):
        if self.entries:
            MyModel.objects.bulk_create(self.entries)
            logging.info("[+] Successfully flushed {0:d} log entries to "
                         "the DB".format(len(self.entries)))
            # reset the buffer so entries are not written twice on the next flush
            self.entries = []
        else:
            logging.info("[*] No log entries for DB logger")
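For context, a handler like this is typically registered through Django's LOGGING setting. A minimal sketch, reusing the dotted path and the 'verbose' formatter name that appear later in the question (both are taken from my setup, not standard names):

```python
# settings.py (sketch)
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'formatters': {
        'verbose': {
            'format': '%(levelname)s %(name)s %(module)s %(message)s',
        },
    },
    'handlers': {
        'db': {
            'class': 'my_app.db_logging.db_logger.DbLogHandler',
            'formatter': 'verbose',
            'level': 'DEBUG',
        },
    },
    'root': {
        'handlers': ['db'],
        'level': 'DEBUG',
    },
}
```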
When I call the function directly, say by running a management command, the handler is used correctly. In production, however, the entry point will be a Celery task. My understanding is that Celery has its own logging machinery. What I have tried to do, but cannot get to work, is to add my db handler to the Celery logging, so that all Celery logs are also sent to DbLogHandler.

This is how I tried to implement it, in my_app.celery_logging.logger:

import logging

import celery
from celery.utils.log import get_task_logger
from django.conf import settings

class CeleryAdapter(logging.LoggerAdapter):
    """Adapter to add current task context to "extra" log fields."""
    def process(self, msg, kwargs):
        if not celery.current_task:
            return msg, kwargs

        kwargs = kwargs.copy()
        kwargs.setdefault('extra', {})['celery'] = \
            vars(celery.current_task.request)
        return msg, kwargs

def task_logger(name):
    """
    Return a custom celery task logger that will also log to db.

    We need to add the db handler explicitly otherwise it is not picked
    up by celery.

    Also, we wrap the logger in a CeleryAdapter to provide some extra celery-
    related context to the logging messages.

    """
    # first get the default celery task logger
    log = get_task_logger(name)

    # if available, add the db-log handler explicitly to the celery task
    # logger
    handlers = settings.LOGGING.get('handlers', [])
    if handlers:
        db_handler_dict = handlers.get('db', None)
        if (db_handler_dict != settings.NULL_HANDLER_PARAMS and
                 db_handler_dict is not None):
            db_handler = {'db': {'class': 'my_app.db_logging.db_logger.DbLogHandler',
                                   'formatter': 'verbose',
                                   'level': 'DEBUG'}}
            log.addHandler(db_handler)

    # wrap the logger by the CeleryAdapter to add some celery specific
    # context to the logs
    return CeleryAdapter(log, {}) 
Finally, in my task.py:

from my_app.celery_logging.logger import task_logger
logger = task_logger(__name__)
But from this point on it is a world of pain. I cannot even describe exactly what is happening. When I start the server and look at the Celery log output, I can see that my db logger is in fact being called, but Celery seems to lose its workers:

[2015-09-18 10:30:57,158: INFO/MainProcess] [*] No log entries for DB logger
Raven is not configured (logging is disabled). Please see the documentation for more information.
2015-09-18 10:30:58,659 raven.contrib.django.client.DjangoClient INFO Raven is not configured (logging is disabled). Please see the documentation for more information.
[2015-09-18 10:30:59,155: DEBUG/MainProcess] | Worker: Preparing bootsteps.
[2015-09-18 10:30:59,157: DEBUG/MainProcess] | Worker: Building graph...
[2015-09-18 10:30:59,158: DEBUG/MainProcess] | Worker: New boot order: {Timer, Hub, Queues (intra), Pool, Autoscaler, Autoreloader, StateDB, Beat, Consumer}
[2015-09-18 10:30:59,161: DEBUG/MainProcess] | Consumer: Preparing bootsteps.
[2015-09-18 10:30:59,161: DEBUG/MainProcess] | Consumer: Building graph...
[2015-09-18 10:30:59,164: DEBUG/MainProcess] | Consumer: New boot order: {Connection, Events, Mingle, Tasks, Control, Gossip, Agent, Heart, event loop}
[2015-09-18 10:30:59,167: DEBUG/MainProcess] | Worker: Starting Hub
[2015-09-18 10:30:59,167: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:30:59,167: DEBUG/MainProcess] | Worker: Starting Pool
[2015-09-18 10:30:59,173: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:30:59,173: DEBUG/MainProcess] | Worker: Starting Consumer
[2015-09-18 10:30:59,174: DEBUG/MainProcess] | Consumer: Starting Connection
[2015-09-18 10:30:59,180: INFO/MainProcess] Connected to amqp://guest:**@127.0.0.1:5672//
[2015-09-18 10:30:59,180: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:30:59,180: DEBUG/MainProcess] | Consumer: Starting Events
[2015-09-18 10:30:59,188: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:30:59,188: DEBUG/MainProcess] | Consumer: Starting Mingle
[2015-09-18 10:30:59,188: INFO/MainProcess] mingle: searching for neighbors
[2015-09-18 10:31:00,196: INFO/MainProcess] mingle: all alone
[2015-09-18 10:31:00,196: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:31:00,197: DEBUG/MainProcess] | Consumer: Starting Tasks
[2015-09-18 10:31:00,203: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:31:00,204: DEBUG/MainProcess] | Consumer: Starting Control
[2015-09-18 10:31:00,207: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:31:00,208: DEBUG/MainProcess] | Consumer: Starting Gossip
[2015-09-18 10:31:00,211: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:31:00,211: DEBUG/MainProcess] | Consumer: Starting Heart
[2015-09-18 10:31:00,212: DEBUG/MainProcess] ^-- substep ok
[2015-09-18 10:31:00,212: DEBUG/MainProcess] | Consumer: Starting event loop
[2015-09-18 10:31:00,213: WARNING/MainProcess] celery@vagrant-base-precise-amd64 ready.
[2015-09-18 10:31:00,213: DEBUG/MainProcess] | Worker: Hub.register Pool...
[2015-09-18 10:31:00,255: ERROR/MainProcess] Unrecoverable error: WorkerLostError('Could not start worker processes',)
Traceback (most recent call last):
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/worker/__init__.py", line 206, in start
    self.blueprint.start(self)
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/bootsteps.py", line 123, in start
    step.start(parent)
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/bootsteps.py", line 374, in start
    return self.obj.start()
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/worker/consumer.py", line 278, in start
    blueprint.start(self)
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/bootsteps.py", line 123, in start
    step.start(parent)
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/worker/consumer.py", line 821, in start
    c.loop(*c.loop_args())
  File "/home/vagrant/.buildout/eggs/celery-3.1.18-py2.7.egg/celery/worker/loops.py", line 48, in asynloop
    raise WorkerLostError('Could not start worker processes')

I also do not see any logs when a Celery task is called.

Set worker_hijack_root_logger to False in the configuration and customize the logger.
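A minimal sketch of that approach (the setting name shown is the Celery 4+ spelling; on the Celery 3.1 in the question the equivalent is CELERYD_HIJACK_ROOT_LOGGER; the app name and the handler import path are assumptions):

```python
from celery import Celery
from celery.signals import after_setup_logger

app = Celery('my_app')  # app name is an assumption
# stop Celery from replacing the root logger's handlers with its own
app.conf.worker_hijack_root_logger = False


@after_setup_logger.connect
def add_db_handler(logger, **kwargs):
    # attach an *instance* of the handler from the question
    # (import path is an assumption)
    from my_app.db_logging.db_logger import DbLogHandler
    logger.addHandler(DbLogHandler())
```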
Are you sure it is not an exception that makes the worker exit? The worker loss happens right after I start the server, so all workers should be waiting for a task. I am asking because Celery starts with task discovery, which imports the task module, and when get_task_logger(...) is called it seems to try to access settings without importing it first. You may be right: when I try to use the logger in a shell, I get AttributeError: 'dict' object has no attribute 'level'. I will dig into it, thanks for the thoughts. @Patrice: that was golden. I now have Celery logging to my db handler as well. But outside the scope of my tasks the db handler is no longer picked up; do you know why that is? Please write it up as an answer and I will give you credit.
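For reference, the AttributeError quoted in the comments matches the config dict that task_logger above passes to log.addHandler: addHandler expects a logging.Handler instance, and the dict has no level attribute to consult at emit time. A minimal sketch of the corrected call, using a stdlib StreamHandler as a stand-in:

```python
import logging

log = logging.getLogger('db-demo')

# Wrong -- passing a config dict, as in task_logger(), later fails with:
#   AttributeError: 'dict' object has no attribute 'level'
# log.addHandler({'class': '...', 'level': 'DEBUG'})

# Right -- construct the handler and pass the instance
handler = logging.StreamHandler()
handler.setLevel(logging.DEBUG)
log.addHandler(handler)
```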