How do I get results from a thread pool in Python?

Tags: python, multithreading, queue, return-value



I've searched here for how to do threading in Python, but so far I haven't found the answer I need. I'm not very familiar with the Queue and Thread Python classes, so some of the answers given here make no sense to me.

I want to create a thread pool that I can assign different tasks to, and when all the threads have finished, collect the result values and process them. So far I've tried to do it like this, but I can't get the results. The code I wrote is:

from threading import Thread
from queue import Queue

class Worker(Thread):
    """Thread executing tasks from a given tasks queue"""
    def __init__(self, tasks):
        Thread.__init__(self)
        self.tasks = tasks
        self.daemon = True
        self.result = None
        self.start()
    def run(self):
        while True:
            func, args, kargs = self.tasks.get()
            try:
                self.result = func(*args, **kargs)
            except Exception as e:
                print(e)
            self.tasks.task_done()
    def get_result(self):
        return self.result

class ThreadPool:
    """Pool of threads consuming tasks from a queue"""
    def __init__(self, num_threads):
        self.tasks = Queue(num_threads)
        self.results = []
        for _ in range(num_threads):
            w = Worker(self.tasks)
            self.results.append(w.get_result())
    def add_task(self, func, *args, **kargs):
        """Add a task to the queue"""
        self.tasks.put((func, args, kargs))
    def wait_completion(self):
        """Wait for completion of all the tasks in the queue"""
        self.tasks.join()
    def get_results(self):
        return self.results

def foo(word, number):
    print(word * number)
    return number

words = ['hello', 'world', 'test', 'word', 'another test']
numbers = [1,2,3,4,5]
pool = ThreadPool(5)
for i in range(0, len(words)):
    pool.add_task(foo, words[i], numbers[i])

pool.wait_completion()
results = pool.get_results()
print(results)
The output prints each word repeated the given number of times, but the results list is full of None values. So where should I put the return value of func?
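The list is full of None because each Worker's result is read once, at pool construction time, before any task has run. One way to get return values out of a hand-rolled pool like the one above is to give the workers a second shared queue to push results onto; the following is a sketch I'm adding for illustration (not code from the original thread):

```python
from threading import Thread
from queue import Queue

class Worker(Thread):
    """Thread that executes tasks from a tasks queue and pushes
    each return value onto a shared results queue."""
    def __init__(self, tasks, results):
        Thread.__init__(self)
        self.tasks = tasks
        self.results = results
        self.daemon = True
        self.start()

    def run(self):
        while True:
            func, args, kwargs = self.tasks.get()
            try:
                # Store one result per task instead of overwriting a
                # single per-thread attribute.
                self.results.put(func(*args, **kwargs))
            except Exception as e:
                self.results.put(e)
            finally:
                self.tasks.task_done()

class ThreadPool:
    """Pool of threads consuming tasks from a queue."""
    def __init__(self, num_threads):
        self.tasks = Queue(num_threads)
        self.results = Queue()
        for _ in range(num_threads):
            Worker(self.tasks, self.results)

    def add_task(self, func, *args, **kwargs):
        self.tasks.put((func, args, kwargs))

    def get_results(self):
        """Block until all tasks finish, then drain the results queue."""
        self.tasks.join()
        out = []
        while not self.results.empty():
            out.append(self.results.get())
        return out

pool = ThreadPool(5)
for n in [1, 2, 3, 4, 5]:
    pool.add_task(lambda x: x * x, n)
print(sorted(pool.get_results()))  # -> [1, 4, 9, 16, 25]
```

Note that results arrive in completion order, not submission order, which is why the demo sorts them; if order matters, the built-in pools below handle that for you.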

Alternatively, the simple way would be to pass a dictionary (or some other mutable variable) into the function as an extra argument to store the result in, and, after adding each task to the queue, append that result argument to a list of results:

def foo(word, number, r):
    print(word * number)
    r[(word, number)] = number
    return number

words = ['hello', 'world', 'test', 'word', 'another test']
numbers = [1, 2, 3, 4, 5]
pool = ThreadPool(5)
results = []
for i in range(len(words)):
    r = {}
    pool.add_task(foo, words[i], numbers[i], r)
    results.append(r)
pool.wait_completion()  # wait for the workers before reading the dicts
print(results)

Python actually has a built-in thread pool you can use, multiprocessing.pool.ThreadPool:

from multiprocessing.pool import ThreadPool

def foo(word, number):
    print(word * number)
    return number

words = ['hello', 'world', 'test', 'word', 'another test']
numbers = [1, 2, 3, 4, 5]
pool = ThreadPool(5)
results = []
for i in range(len(words)):
    results.append(pool.apply_async(foo, args=(words[i], numbers[i])))

pool.close()
pool.join()
results = [r.get() for r in results]
print(results)

Or (using map instead of apply_async):

from multiprocessing.pool import ThreadPool

def foo(word, number):
    print(word * number)
    return number

def starfoo(args):
    """
    We need this because map only supports calling functions with one arg.
    We need to pass two args, so we use this little wrapper function to
    expand a zipped list of all our arguments.
    """
    return foo(*args)

words = ['hello', 'world', 'test', 'word', 'another test']
numbers = [1, 2, 3, 4, 5]
pool = ThreadPool(5)
# We need to zip together the two lists because map only supports calling
# functions with one argument. In Python 3.3+, you can use starmap instead.
results = pool.map(starfoo, zip(words, numbers))
print(results)

pool.close()
pool.join()

Comments:

- Isn't the second case only useful when the number of tasks is the same as the pool size? — No, it works for any number of tasks and any number of workers in the pool. map is useful when you want to run a function over all items of an iterable and get back the result of each call. If you have 5 worker threads and an iterable of length 100, the pool will call the function for all 100 items, but never run more than 5 threads at a time. The output is a list of length 100 containing the result values of all the function calls.
- @RafaelRios One other note: because of the GIL, using threads in Python brings no performance benefit for CPU-bound work. To get around that limitation you need to use multiple processes via the multiprocessing module. For the example above you can make the switch by using from multiprocessing import Pool instead of from multiprocessing.pool import ThreadPool; everything else stays the same.
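As the last comment notes, swapping the thread pool for a process pool is essentially a one-line change. A minimal sketch of that switch, using a toy CPU-bound function I'm adding for illustration:

```python
from multiprocessing import Pool  # process pool instead of ThreadPool

def square(n):
    # CPU-bound work like this can benefit from processes, since each
    # worker process has its own interpreter and is not limited by the GIL.
    return n * n

if __name__ == '__main__':
    # The __main__ guard is required on platforms that start worker
    # processes by re-importing this module (e.g. Windows and macOS).
    with Pool(5) as pool:
        results = pool.map(square, [1, 2, 3, 4, 5])
    print(results)  # -> [1, 4, 9, 16, 25]
```

Pool.map preserves input order, so no extra bookkeeping is needed to match results to inputs; the same apply_async and map calls shown above work unchanged on the process pool.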