
Memory usage of Python multiprocessing


Is the memory consumed by a process spawned by multiprocessing released once that process has been joined?

The scenario I have in mind is roughly as follows:

from multiprocessing import Process
from multiprocessing import Queue
import time
import os

def main():
    tasks = Queue()
    for task in [1, 18, 1, 2, 5, 2]:
        tasks.put(task)

    num_proc = 3                 # this many workers at each point in time
    procs = []
    for j in range(num_proc):
        p = Process(target=run_q, args=(tasks,))
        procs.append(p)
        p.start()

    # join each worker once it is done
    while procs:
        for p in procs[:]:       # iterate over a copy: removing from the
            if not p.is_alive(): # list being iterated would skip items
                p.join()         # what happens to the memory allocated by run()?
                procs.remove(p)
                print(p, len(procs))
        time.sleep(1)

def run_q(task_q):
    while not task_q.empty():    # while there's stuff to do, keep working
        task = task_q.get()
        run(task)

def run(x):                      # does the real work, allocates memory
    print(x, os.getpid())
    time.sleep(3 * x)

if __name__ == "__main__":
    main()

In the real code, tasks is much longer than the number of CPU cores, each task is lightweight, and the tasks differ wildly in both CPU time (minutes to days) and memory consumption (from peanuts to a couple of GB). All of that memory is local to run() and there is no need to share it, so the question is whether that memory is released once run() returns and/or the process is joined.
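
One way to check this empirically is to watch a worker's resident set size from the parent. Below is a minimal sketch, assuming the third-party psutil package is installed; psutil, the hog() helper, and the 200 MB figure are illustrations and not part of the question:

from multiprocessing import Process
import time
import psutil   # third-party: pip install psutil

def hog():
    data = bytearray(200 * 1024 * 1024)   # hold ~200 MB so the RSS change is visible
    time.sleep(5)                         # keep the allocation alive long enough to measure

if __name__ == "__main__":
    p = Process(target=hog)
    p.start()
    time.sleep(1)                         # give the child time to allocate
    rss = psutil.Process(p.pid).memory_info().rss
    print("child RSS while running: %d MB" % (rss // 2**20))
    p.join()                              # the child has exited by now
    print("child still exists after join: %s" % psutil.pid_exists(p.pid))

While the child is sleeping, its RSS should sit roughly 200 MB above the interpreter baseline; after join() the pid no longer exists, so there is nothing left for it to hold.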

The memory consumed by a process is released when the process terminates. In your example, that happens when run_q() returns.
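
For a workload like the one described, where individual tasks can allocate gigabytes, the standard library's Pool offers a related knob: maxtasksperchild recycles a worker process after a fixed number of tasks, so each task's memory is guaranteed to go back to the OS when its worker exits. A minimal sketch of that approach (the Pool-based rewrite is an illustration, not from the original code):

from multiprocessing import Pool
import os
import time

def run(x):                  # does the real work, allocates memory
    print(x, os.getpid())    # the pid changes between tasks: workers are replaced
    time.sleep(0.1 * x)
    return x

if __name__ == "__main__":
    # maxtasksperchild=1 retires each worker after a single task;
    # chunksize=1 keeps one task per chunk so the accounting lines up.
    with Pool(processes=3, maxtasksperchild=1) as pool:
        results = pool.map(run, [1, 18, 1, 2, 5, 2], chunksize=1)
    print(results)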