Persisting a log file stream with PyInotify in Python
I'm having trouble persisting a log file write stream through pyinotify and its threads. I'm using pyinotify to watch a directory for CLOSE_WRITE file events. Before initializing pyinotify, I create a logging stream with the built-in logging module, like so:
import os, logging
from logging import handlers
from logging.config import dictConfig
log_dir = './var/log'
name = 'com.sadmicrowave.tesseract'
LOG_SETTINGS = { 'version' : 1
,'handlers': { 'core': {
# make the logger a rotating file handler so the file automatically gets archived and a new one gets created, preventing files from growing too large to maintain.
'class' : 'logging.handlers.RotatingFileHandler'
# by setting our logger to the DEBUG level (lowest level) we will include all other levels by default
,'level' : 'DEBUG'
# this references the 'core' handler located in the 'formatters' dict element below
,'formatter' : 'core'
# the path and file name of the output log file
,'filename' : os.path.join(log_dir, "%s.log" % name)
,'mode' : 'a'
# the max size we want to log file to reach before it gets archived and a new file gets created
,'maxBytes' : 100000
# the max number of files we want to keep in archive
,'backupCount' : 5 }
}
# create the formatters which are referenced in the handlers section above
,'formatters': {'core': {'format': '%(levelname)s %(asctime)s %(module)s|%(funcName)s %(lineno)d: %(message)s'
}
}
,'loggers' : {'root': {
'level' : 'DEBUG' # The most granular level of logging available in the log module
,'handlers' : ['core']
}
}
}
# use the built-in logger dict configuration tool to convert the dict to a logger config
dictConfig(LOG_SETTINGS)
# get the logger created in the config and named root in the 'loggers' section of the config
__log = logging.getLogger('root')
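Before wiring in pyinotify, the configuration above can be sanity-checked on its own with nothing but the standard library. This is a sketch: the temporary directory and logger name below are placeholders, not the OP's real values.

```python
import logging
import os
import tempfile
from logging.config import dictConfig

log_dir = tempfile.mkdtemp()   # stand-in for './var/log'
name = 'example'               # stand-in for 'com.sadmicrowave.tesseract'

dictConfig({
    'version': 1,
    'handlers': {'core': {
        'class': 'logging.handlers.RotatingFileHandler',
        'level': 'DEBUG',
        'formatter': 'core',
        'filename': os.path.join(log_dir, '%s.log' % name),
        'mode': 'a',
        'maxBytes': 100000,
        'backupCount': 5,
    }},
    'formatters': {'core': {
        'format': '%(levelname)s %(asctime)s %(module)s|%(funcName)s %(lineno)d: %(message)s',
    }},
    'loggers': {'root': {'level': 'DEBUG', 'handlers': ['core']}},
})

log = logging.getLogger('root')
log.info('sanity check')

# The log file should now end with the message we just wrote.
with open(os.path.join(log_dir, '%s.log' % name)) as f:
    print(f.read().strip().endswith('sanity check'))  # → True
```

If this prints True, the logger itself is fine and any later failure is happening on the pyinotify side.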
So, immediately after initializing my __log variable, it works and log writes succeed. Next I want to start the pyinotify instance and pass __log in, using the following class definition:
import asyncore, pyinotify
class Notify (object):
def __init__ (self, log=None, verbose=True):
wm = pyinotify.WatchManager()
wm.add_watch( '/path/to/folder/to/monitor/', pyinotify.IN_CLOSE_WRITE, proc_fun=processEvent(log, verbose) )
notifier = pyinotify.AsyncNotifier(wm, None)
asyncore.loop()
class processEvent (pyinotify.ProcessEvent):
def __init__ (self, log=None, verbose=True):
log.info('logging some cool stuff')
self.__log = log
self.__verbose = verbose
def process_IN_CLOSE_WRITE (self, event):
print event
In the implementation above, my process_IN_CLOSE_WRITE method fires from pyinotify.AsyncNotifier exactly as expected; however, the 'logging some cool stuff' log line is never written to the log file.
I feel it has something to do with persisting the file stream through the pyinotify thread processes, but I can't figure out how to fix it.
Any ideas?

I may have found a solution that seems to work. I'm not sure it's the best approach, so I'll leave the OP open for a while to see whether other ideas get posted.
I think my pyinotify.AsyncNotifier setup was wrong. I changed the implementation to:
class Notify (object):
def __init__ (self, log=None, verbose=True):
notifiers = []
descriptors = []
wm = pyinotify.WatchManager()
notifiers.append ( pyinotify.AsyncNotifier(wm, processEvent(log, verbose)) )
descriptors.append( wm.add_watch( '/path/to/folder/to/monitor/', pyinotify.IN_CLOSE_WRITE, proc_fun=processEvent(log, verbose), auto_add=True ) )
asyncore.loop()
Now the log object is maintained and passed along properly: my wrapper class processEvent fires when the listener is instantiated and again when a CLOSE_WRITE event triggers, and it can receive write events. Unfortunately, once I added the daemonization component using python-daemon and files_preserve, my logging stream disappeared again and my pyinotify events could not be logged to it.
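For what it's worth, a common cause of exactly this symptom is that python-daemon's DaemonContext closes every open file descriptor on daemonization unless it appears in files_preserve, so the streams behind the logging handlers have to be collected and passed in explicitly. A minimal sketch, not a confirmed fix for the OP's setup: the helper is plain stdlib, and the commented usage assumes the third-party python-daemon package.

```python
import logging

def handler_streams(logger):
    """Collect the open stream objects behind a logger's handlers so they
    can be handed to daemon.DaemonContext(files_preserve=...)."""
    return [h.stream for h in logger.handlers if hasattr(h, 'stream')]

# Hypothetical usage with python-daemon (names illustrative only):
# import daemon
# log = logging.getLogger('root')
# with daemon.DaemonContext(files_preserve=handler_streams(log)):
#     Notify(log=log)
```

With the handlers' streams preserved, the daemonized process keeps writing to the same rotating log file instead of silently losing the stream.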