
Python ProcessPoolExecutor logging fails?

Tags: python, logging, python-multiprocessing


I am writing a multiprocessing program to process multiple batches, but my logging never records the batches to the log file; only the root log.info line shows up. How do I set up logging so that it prints to the log file correctly?

The log file only ever contains a single line like "INFO:root:this is root logging".


On Windows/Python 2.7, logging uses a separate instance in each subprocess, and those instances cannot write to the same file. Applying the fix below will solve the problem, but I think a better solution may be a singleton pattern built around logging.getLogger('abc').
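For illustration, a minimal sketch of that getLogger-based idea (the helper name and format string are my own; the point is that each worker process attaches its own handler exactly once, since a spawned Windows subprocess starts with an unconfigured logger):

import logging

def get_batch_logger():
    # Hypothetical helper: return a shared, named logger, configuring it
    # the first time it is used in the current process.
    logger = logging.getLogger('abc')
    if not logger.handlers:
        handler = logging.FileHandler(r'test_ProcessPoolExecutor.log', mode='a')
        handler.setFormatter(logging.Formatter('%(levelname)s:%(name)s:%(message)s'))
        logger.addHandler(handler)
        logger.setLevel(logging.DEBUG)
    return logger

def f(batch):
    get_batch_logger().info('this is sub logging %s', batch)
    return batch

Note that several processes appending to one file can still interleave lines; mode='a' merely avoids truncating it.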


Hmm, that's strange. I think I got different output because I'm on a Unix system (both log lines work fine there). It looks like Windows behaves differently in this case.
@Taras, is there any way to work around this within logging?
I guess the log file is somehow locked, but honestly I have no idea. I spent 30 minutes trying to get ** running on Windows, and I don't want to keep doing anything on that system ;( Good luck, mate!
@TarasMatsyk, found a fix.
The code from the question:

import logging
import concurrent.futures


def process_batchs():
    batches = [i for i in range(100)]
    # Configure the root logger in the parent process.
    logging.basicConfig(filename=r'doc\test_ProcessPoolExecutor.log',
                        filemode='w+', level=logging.DEBUG)
    logging.info('this is root logging')
    with concurrent.futures.ProcessPoolExecutor(10) as e:
        futures = []
        for batch in batches:
            future = e.submit(f, batch)
            futures.append(future)
        # Busy-wait until every future has finished, then collect results.
        while True:
            dones = [future.done() for future in futures]
            if all(dones):
                results = [future.result() for future in futures]
                print(results)
                break


def f(batch):
    # do some thing
    logging.info('this is sub logging' + str(batch))
    return batch


if __name__ == '__main__':
    process_batchs()
And the fix the asker found: call basicConfig inside the worker function as well, so each child process configures its own root logger:

import logging
import concurrent.futures


def process_batchs():
    batches = [i for i in range(100)]
    logging.basicConfig(filename=r'test_ProcessPoolExecutor.log',
                        filemode='w+', level=logging.DEBUG)
    logging.info('this is root logging')
    with concurrent.futures.ProcessPoolExecutor(10) as e:
        futures = []
        for batch in batches:
            future = e.submit(f, batch)
            futures.append(future)
        while True:
            dones = [future.done() for future in futures]
            if all(dones):
                results = [future.result() for future in futures]
                print(results)
                break


def f(batch):
    # do some thing
    # Here is the trick, notice here!!!
    ########
    # Re-run basicConfig in the child process: a spawned worker starts with
    # an unconfigured root logger and does not inherit the parent's handlers.
    # (filemode='a' would keep workers from truncating each other's output.)
    logging.basicConfig(filename=r'test_ProcessPoolExecutor.log',
                        filemode='w+', level=logging.DEBUG)
    ########
    logging.info('this is sub logging' + str(batch))
    return batch


if __name__ == '__main__':
    process_batchs()
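As a side note, on Python 3.7+ (unlike the Python 2.7 setup in the question) ProcessPoolExecutor accepts an initializer callable, so each worker can configure logging once at startup instead of on every call to f. A sketch under that assumption:

import logging
import concurrent.futures


def init_worker():
    # Runs once in every worker process; filemode='a' so workers do not
    # truncate each other's output.
    logging.basicConfig(filename=r'test_ProcessPoolExecutor.log',
                        filemode='a', level=logging.DEBUG)


def f(batch):
    logging.info('this is sub logging %s', batch)
    return batch


def process_batchs():
    logging.basicConfig(filename=r'test_ProcessPoolExecutor.log',
                        filemode='w+', level=logging.DEBUG)
    logging.info('this is root logging')
    with concurrent.futures.ProcessPoolExecutor(10, initializer=init_worker) as e:
        results = list(e.map(f, range(100)))
    print(results)


if __name__ == '__main__':
    process_batchs()

If log lines from different processes must never interleave, the logging cookbook's QueueHandler/QueueListener pattern, where a single process owns the file, is the more robust option.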