Parallelizing file-creation event notifications (file, parallel-processing, python-multithreading, inotify)

I use inotify to log events whenever a file or folder is created in a directory (here /tmp). The example below performs this task as a serial process; that is, all file creations are handled sequentially, one after another.

import logging

import inotify.adapters

_DEFAULT_LOG_FORMAT = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'

_LOGGER = logging.getLogger(__name__)

def _configure_logging():
    _LOGGER.setLevel(logging.DEBUG)

    ch = logging.StreamHandler()

    formatter = logging.Formatter(_DEFAULT_LOG_FORMAT)
    ch.setFormatter(formatter)

    _LOGGER.addHandler(ch)

def _main():
    i = inotify.adapters.Inotify()

    i.add_watch(b'/tmp')

    try:
        for event in i.event_gen():
            if event is not None:
                (header, type_names, watch_path, filename) = event
                _LOGGER.info("WD=(%d) MASK=(%d) COOKIE=(%d) LEN=(%d) MASK->NAMES=%s "
                             "WATCH-PATH=[%s] FILENAME=[%s]",
                             header.wd, header.mask, header.cookie, header.len, type_names,
                             watch_path.decode('utf-8'), filename.decode('utf-8'))
    finally:
        i.remove_watch(b'/tmp')

if __name__ == '__main__':
    _configure_logging()
    _main()
I would like to parallelize the event notifications for the case where several files are uploaded at once. If I import threading, should I start a thread inside the loop?
Second question: I am not sure where it would make sense to put the thread function.
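For reference, a minimal sketch of where such a thread could be started, assuming the same inotify package and byte-path API as the code above (the names _handle_event and _main_threaded are illustrative, not from the original post); one thread per event inside the loop is the simplest placement, and the answer below swaps threads for processes.

import logging
import threading

import inotify.adapters

_LOGGER = logging.getLogger(__name__)

def _handle_event(event):
    # Runs in a worker thread and logs a single inotify event.
    (header, type_names, watch_path, filename) = event
    _LOGGER.info("MASK->NAMES=%s WATCH-PATH=[%s] FILENAME=[%s]",
                 type_names, watch_path.decode('utf-8'), filename.decode('utf-8'))

def _main_threaded():
    i = inotify.adapters.Inotify()
    i.add_watch(b'/tmp')
    try:
        for event in i.event_gen():
            if event is None:
                continue
            # One thread per event, so a slow handler does not block the watcher loop.
            t = threading.Thread(target=_handle_event, args=(event,))
            t.start()
    finally:
        i.remove_watch(b'/tmp')

if __name__ == '__main__':
    logging.basicConfig(level=logging.INFO)
    _main_threaded()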

The following script handles multiple events from multiple sessions, which is sufficient in my case. I added the multiprocessing option instead of threading, because I found multiprocessing to be faster than threading.

import logging
import multiprocessing

import inotify.adapters

_DEFAULT_LOG_FORMAT = '%(asctime)s - %(name)s - %(levelname)s - %(message)s'

_LOGGER = logging.getLogger(__name__)

def _configure_logging():
    _LOGGER.setLevel(logging.DEBUG)

    ch = logging.StreamHandler()

    formatter = logging.Formatter(_DEFAULT_LOG_FORMAT)
    ch.setFormatter(formatter)

    _LOGGER.addHandler(ch)


def PopUpMessage(event):
    # Runs in its own worker process and logs a single inotify event.
    if event is not None:
        (header, type_names, watch_path, filename) = event
        _LOGGER.info("WD=(%d) MASK=(%d) COOKIE=(%d) LEN=(%d) MASK->NAMES=%s "
                     "WATCH-PATH=[%s] FILENAME=[%s]",
                     header.wd, header.mask, header.cookie, header.len, type_names,
                     watch_path.decode('utf-8'), filename.decode('utf-8'))


def My_main():
    i = inotify.adapters.Inotify()
    i.add_watch(b'/PARA')
    try:
        # event_gen() blocks and yields events indefinitely, so no extra
        # outer loop is needed; None entries are periodic heartbeats.
        for event in i.event_gen():
            if event is None:
                continue
            # Hand each event off to its own process so a slow handler
            # does not stall the watcher loop.
            m = multiprocessing.Process(target=PopUpMessage, args=(event,))
            m.start()
    finally:
        i.remove_watch(b'/PARA')


if __name__ == '__main__':
    _configure_logging()
    N = multiprocessing.Process(target=My_main)
    N.start()
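One design note on the script above: it starts a new Process for every event and never joins them, so a burst of file creations launches an unbounded number of children. If that matters, a fixed-size worker pool is a common alternative. A rough sketch under the same /PARA watch follows; the function handle is a stand-in for PopUpMessage, not part of the original answer.

import multiprocessing

import inotify.adapters

def handle(event):
    # Stand-in for PopUpMessage; runs in one of the pool's worker processes.
    (header, type_names, watch_path, filename) = event
    print(type_names, watch_path.decode('utf-8'), filename.decode('utf-8'))

def main():
    i = inotify.adapters.Inotify()
    i.add_watch(b'/PARA')
    # Four long-lived workers instead of one short-lived process per event.
    with multiprocessing.Pool(processes=4) as pool:
        try:
            for event in i.event_gen():
                if event is not None:
                    pool.apply_async(handle, (event,))
        finally:
            i.remove_watch(b'/PARA')

if __name__ == '__main__':
    main()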