
How to run two parallel requests in Python and concatenate the results


I have the following code:

def do_smth(query):
    result_1 = api_request_1(query) # ['1', '2', '3']
    result_2 = api_request_2(query) # ['a', 'b', 'c']
    return result_1 + result_2      # ['1', '2', '3', 'a', 'b', 'c']
Now I want to run these requests in parallel and merge the results, so I wrote:

from multiprocessing import Pool

def do_smth_parallel(query):
    pool = Pool(processes=2)

    result = []
    arg = [ query ]
    result.extend(pool.map(api_request_1, arg)[0])
    result.extend(pool.map(api_request_2, arg)[0])

    pool.close()
    pool.join()

    return result
So far so good, but map is a blocking call, so do_smth_parallel isn't actually very parallel :) How can I do this?


P.S. In Java I would do this with an ExecutorService and two Futures.

You are looking for map_async instead of map. Here is your example; this can be applied to any number of function calls, and all of them will execute asynchronously:

from multiprocessing import Pool

def do_smth_parallel(query):
    pool = Pool(processes=2)

    arg = [ query ]
    # map_async returns an AsyncResult immediately instead of blocking
    future_1 = pool.map_async(api_request_1, arg)
    future_2 = pool.map_async(api_request_2, arg)

    # .get() blocks until the result is ready; [0] unwraps the
    # one-element list produced by mapping over a single argument
    result_1 = future_1.get()[0]
    result_2 = future_2.get()[0]

    pool.close()
    pool.join()

    return result_1 + result_2
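Since each request takes a single argument, apply_async is a slightly more direct fit than map_async (there is no one-element list to unwrap). A minimal runnable sketch, with stub request functions standing in for the real API calls, and using multiprocessing.pool.ThreadPool, which shares the same AsyncResult interface as Pool and is sufficient here because the work is I/O-bound:

```python
from multiprocessing.pool import ThreadPool

# Stub request functions standing in for the real API calls
def api_request_1(query):
    return ['1', '2', '3']

def api_request_2(query):
    return ['a', 'b', 'c']

def do_smth_parallel(query):
    pool = ThreadPool(processes=2)

    # apply_async schedules one call and returns an AsyncResult immediately
    future_1 = pool.apply_async(api_request_1, (query,))
    future_2 = pool.apply_async(api_request_2, (query,))

    # .get() blocks until the corresponding call has finished
    result = future_1.get() + future_2.get()

    pool.close()
    pool.join()
    return result

print(do_smth_parallel('query'))  # ['1', '2', '3', 'a', 'b', 'c']
```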


Another approach is to use the concurrent.futures package:

from concurrent.futures import ThreadPoolExecutor

def do_smth_parallel(query):
    # Executor itself is abstract; ThreadPoolExecutor is a concrete pool
    exc = ThreadPoolExecutor(max_workers=2)

    req1 = exc.submit(api_request_1, query)
    req2 = exc.submit(api_request_2, query)

    return req1.result() + req2.result()
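A self-contained, runnable version of this approach (the api_request_* stubs are placeholders for the real calls). Note that concurrent.futures is in the standard library only from Python 3.2 onward; on Python 2.7 it requires the futures backport from PyPI:

```python
from concurrent.futures import ThreadPoolExecutor

# Stub request functions standing in for the real API calls
def api_request_1(query):
    return ['1', '2', '3']

def api_request_2(query):
    return ['a', 'b', 'c']

def do_smth_parallel(query):
    # The context manager shuts the pool down once both futures resolve
    with ThreadPoolExecutor(max_workers=2) as exc:
        req1 = exc.submit(api_request_1, query)
        req2 = exc.submit(api_request_2, query)
        # .result() blocks until the submitted call completes
        return req1.result() + req2.result()

print(do_smth_parallel('query'))  # ['1', '2', '3', 'a', 'b', 'c']
```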


Thank you very much! This solved the problem, although @Turn's approach is more advanced.