Transferring a file object through a queue in Python

Tags: Python, Multiprocessing, Temporary Files

I have a multiprocessing task that processes input data and writes the results to temporary files (for later use). However, when I try to transfer the file handle to the parent process through a queue, it fails (no exception is raised, but the queue stays empty).


Does anyone know a solution?

The file being kept open seems to be what causes the problem; it works if you call read inside the worker function and close the file once it is done:

import tempfile
from multiprocessing import Process, Queue

def worker(i, queue):
    my_tmp_file = tempfile.NamedTemporaryFile()
    my_tmp_file.write(bytes('Hello world #{}'.format(i), 'utf-8'))
    my_tmp_file.seek(0)
    queue.put(my_tmp_file.read())   # send the bytes, not the file object
    my_tmp_file.close()

q = Queue()

processes = [Process(target=worker, args=(i, q)) for i in range(16)]

for p in processes:
    p.start()

for p in processes:
    p.join()

# qsize() is only approximate in general (and unimplemented on macOS),
# but the workers have already been joined here, so the count is stable
while q.qsize():
    out = q.get()
    print(out)
If you try to put the file object on the queue without reading it first, you get:

TypeError: cannot serialize '_io.FileIO' object
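That error arises because a multiprocessing.Queue pickles everything it sends, and open file objects are not picklable. A minimal sketch reproducing the failure directly with pickle (the exact type name in the message varies with platform and Python version):

```python
import pickle
import tempfile

with tempfile.TemporaryFile() as f:
    try:
        # multiprocessing.Queue pickles objects before sending them,
        # and an open file object cannot be pickled
        pickle.dumps(f)
    except TypeError as e:
        print('Cannot pickle a file object:', e)
```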

Depending on what you are trying to do, it may help to put .name on the queue instead, with delete=False set, and reopen the file in the parent:

import multiprocessing, tempfile
from queue import Empty

def worker(i, queue):
    # delete=False keeps the file on disk after the handle is closed,
    # so the parent can reopen it by name
    with tempfile.NamedTemporaryFile(delete=False) as my_tmp_file:
        my_tmp_file.write(bytes('Hello world #{}'.format(i), 'utf-8'))
        queue.put(my_tmp_file.name)

queue = multiprocessing.Queue()

print('Writing...')
proc = []
for i in range(16):
    # pass the queue explicitly so the code also works with the
    # 'spawn' start method (Windows, macOS)
    proc.append(multiprocessing.Process(target=worker, args=(i, queue)))
    proc[i].start()
for p in proc:
    p.join()

print('Reading...')
my_strings = []
while True:
    try:
        tmp_file = queue.get_nowait()
    except Empty:
        print('All data read; the queue is now empty')
        break
    with open(tmp_file) as f:
        my_strings.append(f.read())   # store the contents, not the file object
But you still have to reopen the file in the parent, so I am not sure this buys you much.
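One caveat with delete=False: the temporary files are never removed automatically, so the parent should delete them after reading. A minimal sketch of that cleanup (read_and_cleanup is a hypothetical helper name, not part of the code above):

```python
import os
import tempfile

def read_and_cleanup(path):
    """Read a delete=False temp file's contents, then remove it from disk."""
    with open(path) as f:
        data = f.read()
    os.remove(path)
    return data

# hypothetical usage with a single temp file
tmp = tempfile.NamedTemporaryFile(mode='w', delete=False)
tmp.write('Hello world #0')
tmp.close()
print(read_and_cleanup(tmp.name))  # prints: Hello world #0
```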
