
Python asyncio queue collecting a coroutine's return value


I have a ClickHouse database against which I want to run asynchronous queries, each on a different node. I found an example of what I needed and modified it a bit (see below). It works, but:

import asyncio

from aioch import Client

result_set = []


async def exec_sql(name, queue, client):
    while True:
        print('name =', name)
        sql = await queue.get()
        result_set.append(await client.execute(sql))
        # Notify the queue that the "work item" has been processed.
        queue.task_done()

async def main():
    num_of_nodes = 10
    num_of_sqls = 20

    ports = range(2441, 2451)

    clients = [Client(host='localhost', port=port, database='database', compression=True) for port in ports]


    # Create a queue that we will use to store our "workload".
    queue = asyncio.Queue()

    # Generate sql's
    for _ in range(num_of_sqls):
        sql = 'select hostName()'
        queue.put_nowait(sql)

    # Create worker tasks to process the queue concurrently.
    tasks = []
    for i in range(num_of_nodes):
        task = asyncio.create_task(exec_sql(f'worker-{i}', queue, clients[i]))
        tasks.append(task)

    # Wait until the queue is fully processed.
    await queue.join()

    # Cancel our worker tasks.
    for task in tasks:
        task.cancel()
    # Wait until all worker tasks are cancelled.
    await asyncio.gather(*tasks, return_exceptions=True)

asyncio.run(main())

print(result_set)

Is there a better way to collect the result of each query than defining an empty list "result_set" at the top of the module?

You should be able to just return the awaited execute there; that way everything gets stitched together, i.e.:

async def exec_sql(name, queue, client):
    print('name =', name)
    sql = await queue.get()
    result = await client.execute(sql)
    # Notify the queue that the "work item" has been processed.
    queue.task_done()
    # Returning exits the worker after one item and makes the result
    # available to whoever awaits the task, e.g. asyncio.gather().
    return result

Did you ever figure out how to get the results through gather?
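
Since the answer leaves open how the caller actually receives those return values, here is a minimal sketch of the gather-based pattern the comment asks about: each worker drains the shared queue and returns its accumulated results, and main() collects them with asyncio.gather(). The aioch Client arguments are copied from the question; the rest is illustrative, not a tested implementation.

import asyncio

from aioch import Client


async def exec_sql(name, queue, client):
    # Drain the queue and return this worker's accumulated results.
    # Polling empty() is safe here because every sql is enqueued before
    # the workers start, so no producer can race with the check.
    results = []
    while not queue.empty():
        sql = await queue.get()
        results.append(await client.execute(sql))
        queue.task_done()
    return results


async def main():
    ports = range(2441, 2451)
    clients = [Client(host='localhost', port=port,
                      database='database', compression=True)
               for port in ports]

    queue = asyncio.Queue()
    for _ in range(20):
        queue.put_nowait('select hostName()')

    tasks = [asyncio.create_task(exec_sql(f'worker-{i}', queue, client))
             for i, client in enumerate(clients)]

    # gather() returns one list of results per worker, in task order.
    # No module-level result_set, no queue.join(), and no cancellation:
    # each worker exits on its own once the queue is empty.
    per_worker = await asyncio.gather(*tasks)
    return [row for rows in per_worker for row in rows]


result_set = asyncio.run(main())
print(result_set)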