Calling a process pool without blocking the event loop in Python

If I run the following code:
import asyncio
import time
import concurrent.futures

def cpu_bound(mul):
    for i in range(mul*10**8):
        i += 1
    print('result = ', i)
    return i

async def say_after(delay, what):
    print('sleeping async...')
    await asyncio.sleep(delay)
    print(what)

# The run_in_pool function must not block the event loop
async def run_in_pool():
    with concurrent.futures.ProcessPoolExecutor() as executor:
        result = executor.map(cpu_bound, [1, 1, 1])

async def main():
    task1 = asyncio.create_task(say_after(0.1, 'hello'))
    task2 = asyncio.create_task(run_in_pool())
    task3 = asyncio.create_task(say_after(0.1, 'world'))
    print(f"started at {time.strftime('%X')}")
    await task1
    await task2
    await task3
    print(f"finished at {time.strftime('%X')}")

if __name__ == '__main__':
    asyncio.run(main())
the output is:
started at 18:19:28
sleeping async...
result = 100000000
result = 100000000
result = 100000000
sleeping async...
hello
world
finished at 18:19:34
This shows that the event loop stays blocked until the CPU-bound jobs of task2 complete, and only then continues with task3.
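A note on why this version blocks: `executor.map` returns a lazy iterator immediately, but leaving the `with` block calls `ProcessPoolExecutor.shutdown(wait=True)`, which waits synchronously for every submitted job — and since `run_in_pool` never awaits anything, it never yields control back to the loop. A minimal standalone sketch of that `with`-block behavior (the `work` function and 0.2 s delay are illustrative, not from the question):

```python
import concurrent.futures
import time

def work(n):
    time.sleep(0.2)
    return n * n

if __name__ == '__main__':
    start = time.monotonic()
    # map() returns right away, but __exit__ calls shutdown(wait=True),
    # so the with-block does not end until all three jobs have finished.
    with concurrent.futures.ProcessPoolExecutor() as executor:
        results = executor.map(work, [1, 2, 3])
    elapsed = time.monotonic() - start
    print(list(results))   # [1, 4, 9]
    print(elapsed >= 0.2)  # True: leaving the block waited for the workers
```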
If instead I run only one CPU-bound job, with run_in_pool written as follows:
async def run_in_pool():
    loop = asyncio.get_running_loop()
    with concurrent.futures.ProcessPoolExecutor() as executor:
        result = await loop.run_in_executor(executor, cpu_bound, 1)
then it does not block the event loop, since the output is:
started at 18:16:23
sleeping async...
sleeping async...
hello
world
result = 100000000
finished at 18:16:28
How can I run the many CPU-bound jobs of task2 in a process pool without blocking the event loop?

As you discovered, you need to use asyncio's own run_in_executor and await the submitted tasks so that they can finish without blocking the event loop. Asyncio doesn't provide an equivalent of map, but it's not hard to emulate one:
async def run_in_pool():
    loop = asyncio.get_running_loop()
    with concurrent.futures.ProcessPoolExecutor() as executor:
        futures = [loop.run_in_executor(executor, cpu_bound, i)
                   for i in (1, 1, 1)]
        result = await asyncio.gather(*futures)
Thanks, that's what I was looking for. Essentially, you need an await statement inside the run_in_pool coroutine to hand control back to the event loop. In fact, the right question for this topic would be: how to emulate the executor.map method in an awaitable way, so that it does not block the event loop.
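Following up on that last comment, the emulation can be packaged as a small reusable helper. This is a sketch: `async_map` is a name invented here (asyncio has no such function), and the loop bound is scaled down from the question's 10**8 to keep the example quick:

```python
import asyncio
import concurrent.futures

def cpu_bound(mul):
    # Same CPU-heavy stand-in as in the question, scaled down.
    total = 0
    for _ in range(mul * 10**6):
        total += 1
    return total

async def async_map(executor, fn, iterable):
    """Awaitable stand-in for executor.map: submit every item through
    run_in_executor, then gather the results without blocking the loop."""
    loop = asyncio.get_running_loop()
    futures = [loop.run_in_executor(executor, fn, item) for item in iterable]
    return await asyncio.gather(*futures)

async def main():
    with concurrent.futures.ProcessPoolExecutor() as executor:
        # Awaiting here yields to the event loop while the workers run.
        results = await async_map(executor, cpu_bound, [1, 1, 1])
    print(results)  # [1000000, 1000000, 1000000]

if __name__ == '__main__':
    asyncio.run(main())
```

By the time the `with` block exits, `gather` has already collected every result, so the synchronous `shutdown(wait=True)` on exit has nothing left to wait for.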