Python multiprocessing: how to write a separate log file for each instance when using pool.map?


I want to create a class where every instance writes its own log file. This works fine when I use a function instead of a class (or when I don't use multiprocessing):

But when I use a class, I get a pickling error:

import multiprocessing, logging

class MyClass(object):
    def __init__(self,A):
        print A
        self.logger = self.setup_logger('Logfile%s' %A, '/dev/shm/Logfile%s.log' %A)
        self.logger.info('text to be written to logfile')

    def setup_logger(self,name_logfile, path_logfile):
        logger = logging.getLogger(name_logfile)
        formatter = logging.Formatter('%(asctime)s:   %(message)s', datefmt='%Y/%m/%d %H:%M:%S')
        fileHandler = logging.FileHandler(path_logfile, mode='w')
        fileHandler.setFormatter(formatter)
        streamHandler = logging.StreamHandler()
        streamHandler.setFormatter(formatter)

        logger.setLevel(logging.DEBUG)
        logger.addHandler(fileHandler)
        logger.addHandler(streamHandler)
        return logger

pool = multiprocessing.Pool(2)
pool.map(MyClass,[1,2])
pool.close()
pool.join()
Output:

1
2
2015/02/12 14:05:09:   text to be written to logfile
2015/02/12 14:05:09:   text to be written to logfile
Process PoolWorker-1:
Traceback (most recent call last):
  File "/usr/lib64/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/usr/lib64/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/lib64/python2.7/multiprocessing/pool.py", line 99, in worker
Process PoolWorker-2:
    put((job, i, result))
  File "/usr/lib64/python2.7/multiprocessing/queues.py", line 392, in put
Traceback (most recent call last):
  File "/usr/lib64/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    return send(obj)
PicklingError: Can't pickle <type 'thread.lock'>: attribute lookup thread.lock failed
    self.run()
  File "/usr/lib64/python2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/lib64/python2.7/multiprocessing/pool.py", line 99, in worker
    put((job, i, result))
  File "/usr/lib64/python2.7/multiprocessing/queues.py", line 392, in put
    return send(obj)
PicklingError: Can't pickle <type 'thread.lock'>: attribute lookup thread.lock failed

Since every log file has its own output path, I can't figure out the cause of this error. I need the logger to be an attribute of the object, so how can I fix this pickling error?
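The cause is not the file paths: pool.map pickles the constructed MyClass instances to send them back to the parent process, and the logger's handlers each hold a thread lock, which cannot be pickled. That can be demonstrated in isolation (a minimal sketch; note that Python 3 raises TypeError where Python 2 raised the PicklingError shown above):

```python
import pickle
import threading

# A logger's handlers each hold a thread lock for thread safety, and
# locks cannot be pickled; pickling the instance drags the logger
# (and its locks) into the pickle, which fails.
try:
    pickle.dumps(threading.Lock())
    failed = False
except (TypeError, pickle.PicklingError) as exc:
    # Python 3 raises TypeError; Python 2 raised the PicklingError above.
    failed = True
    print('pickling failed:', exc)
```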

Basically, you want to call multiprocessing.get_logger() instead of logging.getLogger().
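A minimal sketch of that suggestion. Note that multiprocessing.get_logger() returns a single logger owned by the multiprocessing package itself, shared by all callers; it sidesteps pickling, but it cannot give each instance its own log file:

```python
import logging
import multiprocessing

# get_logger() returns the package-level logger used by multiprocessing;
# repeated calls return the same object, so nothing needs to be pickled.
logger = multiprocessing.get_logger()
logger.setLevel(logging.INFO)
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter('[%(levelname)s/%(processName)s] %(message)s'))
logger.addHandler(handler)
logger.info('hello from the main process')
```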

See the first answer: you cannot pickle a logger. Instead, you can remove the logger when the object is pickled and set it up again when it is unpickled:

import multiprocessing, logging


class MyClass(object):

    def __init__(self,A):
        print A
        self.A = A # we need to keep the name!
        self.logger = self.setup_logger('Logfile%s' %A, '/misc/hy5/scheffler/Skripte_Models/python/Tests/Logfile%s.log' %A)
        self.logger.info('text to be written to logfile')

    def setup_logger(self,name_logfile, path_logfile):
        logger = logging.getLogger(name_logfile)
        formatter = logging.Formatter('%(asctime)s:   %(message)s', datefmt='%Y/%m/%d %H:%M:%S')
        fileHandler = logging.FileHandler(path_logfile, mode='w')
        fileHandler.setFormatter(formatter)
        streamHandler = logging.StreamHandler()
        streamHandler.setFormatter(formatter)

        logger.setLevel(logging.DEBUG)
        logger.addHandler(fileHandler)
        logger.addHandler(streamHandler)
        return logger

    def __getstate__(self):
        """Called for pickling.

        Removes the logger to allow pickling and returns a copy of `__dict__`.

        """
        statedict = self.__dict__.copy()
        if 'logger' in statedict:
            # Pickling does not work with loggers objects, so we just keep the logger's name:
            del statedict['logger']
        return statedict

    def __setstate__(self, statedict):
        """Called after loading a pickle dump.

        Restores `__dict__` from `statedict` and adds a new logger.

        """
        self.__dict__.update(statedict)
        process_name = multiprocessing.current_process().name
        self.logger = self.setup_logger('Logfile%s' % self.A,
                       '/dev/shm/Logfile%s_%s.log' % (self.A, process_name))
Note that we add the process name to the log file name to prevent several processes from writing to the same file! You may also want to make sure that the log handlers and the corresponding files are closed at some point.
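That cleanup might look like the sketch below; close_logger is a hypothetical helper, not part of the answer above:

```python
import logging
import os
import tempfile

def close_logger(logger):
    """Close and detach every handler so files are flushed and released."""
    for handler in list(logger.handlers):
        handler.close()
        logger.removeHandler(handler)

# demo using a temp file instead of the answer's paths
path = os.path.join(tempfile.gettempdir(), 'close_demo.log')
log = logging.getLogger('close_demo')
log.setLevel(logging.DEBUG)
log.addHandler(logging.FileHandler(path, mode='w'))
log.info('last message')
close_logger(log)
print(len(log.handlers))
```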

EDIT:


There is one in the multiprocessing module. However, I have always found it a bit too limited.

Thanks a lot! This helped me fix the error.