Python asyncio queue: collecting a coroutine's return value
I have a ClickHouse database against which I want to run asynchronous queries, each query on a different node. I found an example that does what I need and modified it slightly (see below). It works, but:
import asyncio

from aioch import Client

result_set = []

async def exec_sql(name, queue, client):
    while True:
        print('name =', name)
        sql = await queue.get()
        result_set.append(await client.execute(sql))
        # Notify the queue that the "work item" has been processed.
        queue.task_done()

async def main():
    num_of_nodes = 10
    num_of_sqls = 20
    ports = range(2441, 2451)
    clients = [Client(host='localhost', port=port, database='database', compression=True)
               for port in ports]
    # Create a queue that we will use to store our "workload".
    queue = asyncio.Queue()
    # Generate sql's
    for _ in range(num_of_sqls):
        sql = 'select hostName()'
        queue.put_nowait(sql)
    # Create worker tasks to process the queue concurrently.
    tasks = []
    for i in range(num_of_nodes):
        task = asyncio.create_task(exec_sql(f'worker-{i}', queue, clients[i]))
        tasks.append(task)
    # Wait until the queue is fully processed.
    await queue.join()
    # Cancel our worker tasks.
    for task in tasks:
        task.cancel()
    # Wait until all worker tasks are cancelled.
    await asyncio.gather(*tasks, return_exceptions=True)

asyncio.run(main())
print(result_set)
Is there a better way to collect the results of each query than defining an empty list result_set at the start?

You should be able to return the awaited execution right there. That way it all gets stitched together, i.e.
async def exec_sql(name, queue, client):
    while True:
        print('name =', name)
        sql = await queue.get()
        result = await client.execute(sql)
        # Notify the queue that the "work item" has been processed.
        queue.task_done()
        return result
Did you ever figure out how to get the results back with gather?
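One way to do that: have each worker drain the queue, accumulate its own results, and return them, then let asyncio.gather collect every worker's return value. A minimal runnable sketch below, with the ClickHouse client replaced by a stub (the aioch `client.execute` call and the hostnames are assumptions from the original code, not reproduced here):

```python
import asyncio

async def exec_sql(name, queue):
    # Each worker pulls work until the queue is empty, then returns
    # its own list of results instead of appending to a global.
    results = []
    while True:
        try:
            sql = queue.get_nowait()
        except asyncio.QueueEmpty:
            return results
        # Stand-in for `await client.execute(sql)`; here we just echo
        # the query so the sketch runs without a ClickHouse node.
        results.append(f'{name}: {sql}')
        queue.task_done()
        await asyncio.sleep(0)  # yield control so other workers get a turn

async def main():
    num_of_nodes = 10
    num_of_sqls = 20
    queue = asyncio.Queue()
    for _ in range(num_of_sqls):
        queue.put_nowait('select hostName()')
    # gather() returns each coroutine's return value, in argument order,
    # so no shared result_set list is needed.
    per_worker = await asyncio.gather(
        *(exec_sql(f'worker-{i}', queue) for i in range(num_of_nodes)))
    # Flatten the per-worker lists into one result set.
    return [r for worker_results in per_worker for r in worker_results]

result_set = asyncio.run(main())
print(len(result_set))
```

This drops the queue.join()/cancel dance entirely: because all queries are enqueued before the workers start, each worker can simply exit when get_nowait raises QueueEmpty, and gather hands the results back. If work were produced while workers run, you would need a sentinel item per worker instead of relying on an empty queue.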