Python multiprocessing of a simple function doesn't work, but why?

I'm trying to multiprocess a system command, but I can't get it to work even with a simple program. The function runit(cmd) works fine on its own, but:

#!/usr/bin/python3
from subprocess import call, run, PIPE,Popen
from multiprocessing import Pool
import os
pool = Pool()

def runit(cmd):
    proc = Popen(cmd, shell=True,stdout=PIPE, stderr=PIPE, universal_newlines=True)
    return proc.stdout.read()

#print(runit('ls -l'))

it = []
for i in range(1,3):
    it.append('ls -l')

results = pool.map(runit, it)
It outputs:

Process ForkPoolWorker-1:
Process ForkPoolWorker-2:
Traceback (most recent call last):
Traceback (most recent call last):
  File "/usr/lib/python3.5/multiprocessing/process.py", line 249, in _bootstrap
    self.run()
  File "/usr/lib/python3.5/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/lib/python3.5/multiprocessing/pool.py", line 108, in worker
    task = get()
  File "/usr/lib/python3.5/multiprocessing/queues.py", line 345, in get
    return ForkingPickler.loads(res)
AttributeError: Can't get attribute 'runit' on <module '__main__' from './syscall.py'>
  File "/usr/lib/python3.5/multiprocessing/process.py", line 249, in _bootstrap
    self.run()
  File "/usr/lib/python3.5/multiprocessing/process.py", line 93, in run
    self._target(*self._args, **self._kwargs)
  File "/usr/lib/python3.5/multiprocessing/pool.py", line 108, in worker
    task = get()
  File "/usr/lib/python3.5/multiprocessing/queues.py", line 345, in get
    return ForkingPickler.loads(res)
AttributeError: Can't get attribute 'runit' on <module '__main__' from './syscall.py'>
Then it somehow just waits and does nothing, and when I press Ctrl+C a few times it spits out:

^CProcess ForkPoolWorker-4:
Process ForkPoolWorker-6:
Traceback (most recent call last):
  File "./syscall.py", line 17, in <module>
Process ForkPoolWorker-5:
    results = pool.map(runit, it)
  File "/usr/lib/python3.5/multiprocessing/pool.py", line 260, in map
...
    buf = self._recv(4)
  File "/usr/lib/python3.5/multiprocessing/connection.py", line 379, in _recv
    chunk = read(handle, remaining)
KeyboardInterrupt

I'm not sure, since the problem I know of is Windows-related (and I don't have access to a Linux box to reproduce it), but to be portable you have to wrap the multiprocessing-dependent commands in if __name__ == "__main__":, or it conflicts with the way Python spawns its processes. That fixed example runs fine on Windows (and should work on other platforms as well):
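A minimal sketch of that fix, keeping the same runit and the same inputs and creating the Pool only inside the __main__ guard:

#!/usr/bin/python3
from subprocess import Popen, PIPE
from multiprocessing import Pool

def runit(cmd):
    proc = Popen(cmd, shell=True, stdout=PIPE, stderr=PIPE, universal_newlines=True)
    return proc.stdout.read()

it = []
for i in range(1, 3):
    it.append('ls -l')

if __name__ == "__main__":
    # The Pool is created only after runit is defined and only in the main module,
    # so the worker processes can look up runit without the AttributeError.
    pool = Pool()
    results = pool.map(runit, it)
    print(results)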

(Studying the error messages more closely, I'm now fairly sure that simply moving pool = Pool() after the declaration of runit would also fix it on Linux, but wrapping it in __main__ both fixes it and makes it portable.)
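For completeness, a sketch of that Linux-only variant as I read the remark above; it relies on fork, so it is still not portable to platforms that spawn processes instead:

#!/usr/bin/python3
from subprocess import Popen, PIPE
from multiprocessing import Pool

def runit(cmd):
    proc = Popen(cmd, shell=True, stdout=PIPE, stderr=PIPE, universal_newlines=True)
    return proc.stdout.read()

# Creating the pool *after* runit exists means the forked workers already
# carry runit in their copy of __main__, so the lookup no longer fails.
pool = Pool()

it = []
for i in range(1, 3):
    it.append('ls -l')

results = pool.map(runit, it)
print(results)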

That said, note that your multiprocessing workers each just create a new process anyway (via Popen), so you'd be better off with a thread pool (multiprocessing.pool.ThreadPool): threads that create the processes, like this:

from multiprocessing.pool import ThreadPool  # uses threads, not processes
from subprocess import Popen, PIPE

def runit(cmd):
    proc = Popen(cmd, shell=True, stdout=PIPE, stderr=PIPE, universal_newlines=True)
    return proc.stdout.read()

it = []
for i in range(1, 3):
    it.append('ls -l')

if __name__ == "__main__":
    pool = ThreadPool()   # ThreadPool instead of Pool
    results = pool.map(runit, it)
    print(results)

The latter solution is more lightweight and less issue-prone (multiprocessing is a delicate module to handle). You'll be able to work with objects, shared data, etc. without needing a Manager object, among other advantages.
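A short sketch of that point (my own illustration, using two arbitrary shell commands): with a ThreadPool the workers run inside the parent process, so they can write into a plain dict directly, whereas a process Pool would need return values or a multiprocessing.Manager to get the same data back:

from multiprocessing.pool import ThreadPool
from subprocess import Popen, PIPE

outputs = {}  # plain dict, shared by all threads of the same process

def runit(cmd):
    proc = Popen(cmd, shell=True, stdout=PIPE, stderr=PIPE, universal_newlines=True)
    outputs[cmd] = proc.stdout.read()  # no Manager needed: same address space

if __name__ == "__main__":
    pool = ThreadPool()
    pool.map(runit, ['ls -l', 'uname -a'])
    print(outputs)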

Note that using multithreading instead of multiprocessing would be better here: subprocess.Popen already does the "multiprocessing" by spawning a child process, and multithreading is easier to handle.

I'm running it under Linux Mint 18.1 KDE and your solution works great! I'm sure the threaded solution works; can you confirm that the modified multiprocessing solution works as well?

Yes. However, running 100 tasks on 4 CPUs comes out about the same with ThreadPool as with Pool, but I understand the difference and choose ThreadPool.

Creating and terminating threads is faster than creating processes, so keep it threaded. If you have to do things like pure-Python loops, you'll have to switch to multiprocessing, because the GIL ensures that only one thread runs at a time (that is needed for memory integrity; the design was chosen to avoid having to protect lists, dicts, etc. with mutexes).

I have the same problem, but it hangs without any error. I have a feeling it's memory-related, because in my case the process's memory usage climbs to 98% before it hangs.