
Python: sharing a dict and an Array between a pool of processes


I have been trying to create a dictionary that uses a device MAC id as the key and holds, in a list, the information corresponding to that MAC:

{00-00-0A-14-01-06:[['CMTS-51-55_10.20', '10.20.1.1', '342900', 'Cable6/0/0', '110', 'Cable6/0/0-upstream0', '129', 'Cable6/0/0-downstream', '00-00-0A-14-01-06', '10.20.1.6', '11', '1', '1424419744000', '692306', 'SignalingDown', '1', '118800000', '990000', '0', '0', '0', '342900'], 
['CMTS-51-55_10.20', '10.20.1.1', '343800', 'Cable6/0/0', '110', 'Cable6/0/0-upstream0', '129', 'Cable6/0/0-downstream', '00-00-0A-14-01-06', '10.20.1.6', '11', '1', '1424420644000', '692306', 'SignalingDown', '1', '118800000', '990000', '0', '0', '0', '343800'], 
['CMTS-51-55_10.20', '10.20.1.1', '342900', 'Cable6/0/0', '110', 'Cable6/0/0-upstream0', '129', 'Cable6/0/0-downstream', '00-00-0A-14-01-06', '10.20.1.6', '11', '1', '1424419744000', '377773', 'SignalingUp', '2', '118800000', '990000', '0', '0', '0', '342900']]} 
These data values are retrieved from multiple files kept in multiple folders; one folder can contain multiple files.

I give this list of folders to a pool of processes, so that all the files within one folder are handled by a single process.

I maintain a local dictionary (a collections.defaultdict), fill it with the complete information, and then put that information into a shared dictionary (a manager.dict) that is passed as an argument to the Pool object.

I also pass a character Array, used to share some template information between the child processes and the main process.

I am trying to test the shared objects in the multiprocessing part, but I cannot seem to get it to work.

Could someone please help?

#!/usr/local/bin/pypy

from multiprocessing import Process
from multiprocessing import Pool, Manager, Value, Array
import collections
from collections import defaultdict
import itertools
import os

def info(title):
    print title
    print 'module name:', __name__
    if hasattr(os, 'getppid'):  # only available on Unix
        print 'parent process:', os.getppid()
    print 'process id:', os.getpid()

def f(template,mydict):
    name = 'bob'
    info('function f')
    resultDeltaArray = collections.defaultdict(list)
    resultDeltaArray['b'].append("hi")
    resultDeltaArray['b'].append("bye")
    resultDeltaArray['c'].append("bye")
    resultDeltaArray['c'].append("bye")
    template = "name"
    print resultDeltaArray
    #print "templaate1", template
    for k,v in resultDeltaArray.viewitems():
        mydict[k] = v
    print 'hello', name
    #mydict = resultDeltaArray
    for k,v in mydict.items():
        print mydict[k]
        #del mydict[k]

if __name__ == '__main__':
    info('main line')
    manager = Manager()
    mydict = manager.dict()
    template = Array('c',50)
    #mydict[''] = []
    #print mydict
    todopool = Pool(2)
    todopool.map_async(f, itertools.repeat(template),itertools.repeat(mydict))
    #print "hi"
    #p = Process(target=f, args=('bob',template,mydict))
    #p.start()
    #p.join()
    print mydict
    mydict.clear()
    print mydict

    print "template2", template
The code is only meant to exercise the multiprocessing part; it is not the actual implementation. In this case it just hangs, doing nothing after printing:

main line
module name: __main__
parent process: 27301
process id: 27852
When I try to interrupt the processes with Ctrl-C, it gets stuck again after printing:

Traceback (most recent call last):
  File "/home/pydev/checkouts/dev/trunk/thirdparty/pypy_2.1/lib-python/2.7/multiprocessing/process.py", line 258, in _bootstrap
  Process PoolWorker-2:
Traceback (most recent call last):
  File "/home/pydev/checkouts/dev/trunk/thirdparty/pypy_2.1/lib-python    /2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/home/pydev/checkouts/dev/trunk/thirdparty/pypy_2.1/lib-python /2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/home/pydev/checkouts/dev/trunk/thirdparty/pypy_2.1/lib-python/2.7/multiprocessing/pool.py", line 85, in worker
    self.run()
  File "/home/pydev/checkouts/dev/trunk/thirdparty/pypy_2.1/lib-python/2.7/multiprocessing/process.py", line 114, in run
    self._target(*self._args, **self._kwargs)
  File "/home/pydev/checkouts/dev/trunk/thirdparty/pypy_2.1/lib-python/2.7/multiprocessing/pool.py", line 85, in worker
    task = get()
  File "/home/pydev/checkouts/dev/trunk/thirdparty/pypy_2.1/lib-python/2.7/multiprocessing/queues.py", line 374, in get
    racquire()
KeyboardInterrupt
    task = get()
  File "/home/pydev/checkouts/dev/trunk/thirdparty/pypy_2.1/lib-python/2.7/multiprocessing/queues.py", line 376, in get
    return recv()
Am I using things the right way? Does the Pool object not allow a multiprocessing Array or a manager.dict as arguments? Is there another way to do the same thing?
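
For reference, a likely cause of the hang: in Python 2.7, Pool.map_async(func, iterable, chunksize) takes a single iterable, so the second itertools.repeat(mydict) is consumed as the chunksize, and map_async first converts the iterable to a list, which never finishes for an unbounded itertools.repeat(template). A synchronized multiprocessing Array also cannot be pickled into a pool task; it has to reach the workers by inheritance. Below is a minimal sketch of a corrected call on Unix, assuming the goal is only to run f in two workers; init_worker and shared_template are illustrative names, not part of the original code.

# Minimal sketch (illustrative, not the original code): hand the shared
# Array to the workers through the Pool initializer, and pass the
# picklable manager.dict proxy as the ordinary task argument.
from multiprocessing import Pool, Manager, Array

shared_template = None

def init_worker(template):
    # Runs once per worker; on Unix the Array arrives by fork inheritance
    # rather than by pickling, which a pool task argument would require.
    global shared_template
    shared_template = template

def f(mydict):
    mydict['b'] = ['hi', 'bye']     # proxy writes go back to the manager
    print 'template length:', len(shared_template)

if __name__ == '__main__':
    manager = Manager()
    mydict = manager.dict()
    template = Array('c', 50)
    pool = Pool(2, initializer=init_worker, initargs=(template,))
    result = pool.map_async(f, [mydict, mydict])  # finite iterable
    result.wait()
    pool.close()
    pool.join()
    print mydict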

A dict (implemented as an in-memory hash table) is by design not well suited to sharing between processes, since processes do not inherently share memory.

Consider using threads, which do share memory, perhaps with the ThreadPool from multiprocessing.pool (from multiprocessing.pool import ThreadPool). Or use another structure, such as a persistent, shareable data store, or a shared database that multiple processes can access; install and use that or some other data store designed for sharing across processes.
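
A minimal sketch of the ThreadPool route, assuming each work item yields one (mac, rows) pair; since threads share the parent's memory, the results can be collected into an ordinary dict with no Manager at all. The process_folder name and the sample work list are hypothetical.

# Threads share memory, so the parent can assemble a plain dict
# from the results returned by the worker threads.
from multiprocessing.pool import ThreadPool

def process_folder(item):
    mac, rows = item            # hypothetical per-folder work
    return mac, rows

if __name__ == '__main__':
    work = [('00-00-0A-14-01-06', ['row1', 'row2']),
            ('00-00-0A-14-01-07', ['row3'])]
    pool = ThreadPool(2)
    mydict = dict(pool.map(process_folder, work))
    pool.close()
    pool.join()
    print mydict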


The docs also show how to share data across processes using queues and pipes, but that may not be what you want (a shared key/value store).
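
A sketch of that queue pattern, assuming each worker can send its finished (key, value) pair back to the parent instead of mutating shared state; the worker function and sample data here are illustrative.

# Workers push results onto a Queue; the parent assembles the dict.
from multiprocessing import Process, Queue

def worker(q, key, rows):
    q.put((key, rows))          # send the result back, share nothing

if __name__ == '__main__':
    q = Queue()
    jobs = [Process(target=worker, args=(q, k, v))
            for k, v in [('a', [1, 2]), ('b', [3])]]
    for p in jobs:
        p.start()
    mydict = dict(q.get() for _ in jobs)  # drain the queue before joining
    for p in jobs:
        p.join()
    print mydict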

You never actually took the time to ask a question. We need to know what happened, and why you think something else should have happened... ideally followed by a sentence ending in a question mark.

When I use a Process object, the shared dictionary works fine. But when I use a Pool, none of it works. Is there a difference between child processes created with Process and with Pool?
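
There is indeed a difference: arguments to a Process are inherited at fork time (on Unix), while Pool task arguments are pickled through a queue, and a synchronized multiprocessing.Array refuses to be pickled that way. A sketch of the Process variant the commenter says works:

# With Process, the shared Array and the dict proxy are handed over
# at fork time rather than pickled into a task.
from multiprocessing import Process, Manager, Array

def f(template, mydict):
    mydict['b'] = ['hi', 'bye']
    template.value = 'name'     # 'c'-typecode Arrays expose .value

if __name__ == '__main__':
    manager = Manager()
    mydict = manager.dict()
    template = Array('c', 50)
    p = Process(target=f, args=(template, mydict))  # inherited, not pickled
    p.start()
    p.join()
    print mydict
    print 'template:', template.value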