Python: how do I return a Counter dict from a function passed to multiprocessing?


I have a list of CSV files. I want to perform a set of operations on each of them, generate a Counter dict from each, and build a master list containing the individual Counter dicts from all the CSV files. I want to process each CSV file in parallel and then return the Counter dict from each file. I found a similar solution here:

I used the solution suggested by David Cullen. That solution works perfectly for strings, but when I try to return a Counter dict or a normal dict, all the CSV files are processed up to `send_end.send(result)`, and on execution it hangs there forever and then throws a MemoryError. I am running this on a Linux server that has more than enough memory to create the list of Counter dicts.

I used the following code:

import os
import multiprocessing
from collections import Counter

#get current working directory
cwd = os.getcwd()

#take a list of all files in cwd
files = os.listdir(cwd)

#defining the function that needs to be done on all csv files
def worker(f, send_end):
    infile = open(f)
    #read lines in csv file
    lines = infile.readlines()
    infile.close()
    #split the lines by "," and store it in a list of lists
    master_lst = [line.strip().split(",") for line in lines]
    #extract the second field in each sublist
    counter_lst = [element[1] for element in master_lst]
    print "Total elements in the list: " + str(len(counter_lst))
    #create a dictionary of count elements
    a = Counter(counter_lst)
    #return the counter dict
    send_end.send(a)

def main():
    jobs = []
    pipe_list = []
    for f in files:
        if f.endswith('.csv'):
           recv_end, send_end = multiprocessing.Pipe(duplex=False)
           p = multiprocessing.Process(target=worker, args=(f, send_end))
           jobs.append(p)
           pipe_list.append(recv_end)
           p.start()

    for proc in jobs:
       proc.join()
    result_list = [x.recv() for x in pipe_list]
    print len(result_list)

if __name__ == '__main__':
     main()
The error I get is as follows:

Process Process-42:
Traceback (most recent call last):
  File "/usr/lib64/python2.7/multiprocessing/process.py", line 258, in 
  _bootstrap
  self.run()
  File "/usr/lib64/python2.7/multiprocessing/process.py", line 114, in run
  self._target(*self._args, **self._kwargs)
  File "/home/amm/python/collapse_multiprocessing_return.py", line 32, in 
  worker
  a = Counter(counter_lst)
  File "/usr/lib64/python2.7/collections.py", line 444, in __init__
  self.update(iterable, **kwds)
  File "/usr/lib64/python2.7/collections.py", line 526, in update
  self[elem] = self_get(elem, 0) + 1
 MemoryError
 Process Process-17:
 Traceback (most recent call last):
 Process Process-6:
 Traceback (most recent call last):
 File "/usr/lib64/python2.7/multiprocessing/process.py", line 258, in 
 _bootstrap
 File "/usr/lib64/python2.7/multiprocessing/process.py", line 258, in 
 _bootstrap
 Process Process-8:
 Traceback (most recent call last):
 File "/usr/lib64/python2.7/multiprocessing/process.py", line 258, in 
 _bootstrap
 self.run()
 self.run()
 self.run()
 File "/usr/lib64/python2.7/multiprocessing/process.py", line 114, in run
 File "/usr/lib64/python2.7/multiprocessing/process.py", line 114, in run
 self._target(*self._args, **self._kwargs)
 File "/usr/lib64/python2.7/multiprocessing/process.py", line 114, in run
 File "/home/amm/python/collapse_multiprocessing_return.py", line 32, in 
 worker
 self._target(*self._args, **self._kwargs)
 self._target(*self._args, **self._kwargs)
 File "/home/amm/python/collapse_multiprocessing_return.py", line 32, in 
 worker
 File "/home/amm/python/collapse_multiprocessing_return.py", line 32, in 
 worker
 a = Counter(counter_lst_lst)
 a = Counter(counter_lst_lst)
 a = Counter(counter_lst_lst)
 File "/usr/lib64/python2.7/collections.py", line 444, in __init__
 File "/usr/lib64/python2.7/collections.py", line 444, in __init__
 File "/usr/lib64/python2.7/collections.py", line 444, in __init__
 self.update(iterable, **kwds)
 File "/usr/lib64/python2.7/collections.py", line 526, in update
 self[elem] = self_get(elem, 0) + 1
 MemoryError
 self.update(iterable, **kwds)
 self.update(iterable, **kwds)
 File "/usr/lib64/python2.7/collections.py", line 526, in update
 File "/usr/lib64/python2.7/collections.py", line 526, in update
 self[elem] = self_get(elem, 0) + 1
 self[elem] = self_get(elem, 0) + 1
 MemoryError
 MemoryError
 Process Process-10:
 Traceback (most recent call last):
 File "/usr/lib64/python2.7/multiprocessing/process.py", line 258, in 
 _bootstrap
 self.run()
 File "/usr/lib64/python2.7/multiprocessing/process.py", line 114, in run
 self._target(*self._args, **self._kwargs)
 File "/home/amm/python/collapse_multiprocessing_return.py", line 32, in 
 worker
 a = Counter(counter_lst)
 File "/usr/lib64/python2.7/collections.py", line 444, in __init__
 self.update(iterable, **kwds)
 File "/usr/lib64/python2.7/collections.py", line 526, in update
 self[elem] = self_get(elem, 0) + 1
 MemoryError
 ^Z
 [18]+  Stopped                 collapse_multiprocessing_return.py
Now, if I replace `a` in `send_end.send(a)` with `f`, the filename, the script prints the number of CSV files in the directory, which is what `len(result_list)` amounts to in that case. But when the Counter dict `a` is returned, it gets stuck forever and throws the above error.

I want the code to receive the Counter dicts at the receiving end of the pipe without any errors/issues. Is there a workaround? Could anyone suggest a possible solution?

P.S.: I am new to the multiprocessing module, so apologies if this question sounds naive. I also tried multiprocessing.Manager, but got a similar error.

Your traceback mentions Process-42:, so at least 42 processes are being created. You are creating one process per CSV file, which is not useful and probably causes the MemoryError.
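A separate observation, beyond the answer above: the hang itself is a classic Pipe deadlock. A pipe's OS buffer is limited, so when a worker sends a large Counter, `send()` blocks until the parent calls `recv()`; but in the original code the parent is blocked in `join()`, so neither side can proceed. Draining the pipes before joining avoids this. A minimal sketch (the function names are illustrative, not from the question):

```python
import multiprocessing
from collections import Counter

def worker(items, send_end):
    # send() may block on a large payload until the parent recv()s,
    # so the parent must not sit in join() before receiving.
    send_end.send(Counter(items))
    send_end.close()

def count_in_processes(datasets):
    jobs, pipes = [], []
    for items in datasets:
        recv_end, send_end = multiprocessing.Pipe(duplex=False)
        p = multiprocessing.Process(target=worker, args=(items, send_end))
        jobs.append(p)
        pipes.append(recv_end)
        p.start()
    # Drain the pipes first, THEN join: each recv() unblocks a worker's send().
    results = [r.recv() for r in pipes]
    for p in jobs:
        p.join()
    return results
```

The only change from the question's structure is the order of the `recv()` loop and the `join()` loop.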

Your problem can be solved much more simply with multiprocessing.Pool.map. The worker function can also be shortened considerably:

def worker(f):
    # Return the Counter directly; Pool.map collects the return values.
    with open(f) as infile:
        return Counter(line.strip().split(",")[1]
                       for line in infile)

def main():
    pool = multiprocessing.Pool()
    result_list = pool.map(worker, [f for f in files if f.endswith('.csv')])

Passing no argument to Pool means it will create as many processes as you have CPU cores. Using more may or may not improve performance.
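Since the question also asks for a master count across all the CSV files, the per-file Counters returned by `pool.map` can be merged afterwards; `Counter` supports `+`, so the list collapses with `sum`. A small sketch (the values in `result_list` are hypothetical stand-ins for what `pool.map(worker, ...)` would return):

```python
from collections import Counter

# Per-file results, e.g. from pool.map(worker, ...); hypothetical values here.
result_list = [Counter({"apple": 2, "banana": 1}), Counter({"apple": 1})]

# Counter supports +, so summing collapses the list into one master count.
master_count = sum(result_list, Counter())
print(master_count)  # Counter({'apple': 3, 'banana': 1})
```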

1. Your error message is missing. 2. Please provide a script that we can run ourselves. @AlexHall: I have now included the entire message log, up to the point where I killed the script. Thanks. It seems I was overcomplicating things myself. Thanks for the code.