How to use a function pointer as a multiprocessing argument in Python

I have several functions that all take the same arguments, as shown below:

def cat(time , dist)
    return random.randint(1, 400) * time + random.randint(1, 5) * dist

def dog(time , dist)
    return random.randint(1, 300) * time + random.randint(1, 7) * dist

def rabbit(time , dist)
    return random.randint(1, 200) * time + random.randint(1, 3) * dist

def turtle(time , dist)
    return random.randint(1, 100) * time + random.randint(1, 1) * dist


if __name__ == '__main__':
    FunArray = {
        1:cat
        2:dog
        3:rabbit
        4:turtle
    }
    pool = multiprocessing.Pool(processes=2)
    q=10
    for i in xrange(1,4):
        workers = pool.apply_async(FunArray[i], args=(i, q))
    pool.close()
    pool.join()

I only want to run two processes at a time, and I want to pass the function for each process via a function pointer. However, this program does not work.

First, there are a few syntax errors (a corrected sketch of these fragments follows the list):

  • A colon is missing at the end of each function definition header:

    def cat(time , dist):
                        ^
    
  • A comma is missing after each item in the dictionary literal:

    1:cat,
         ^
    
  • The body of the if __name__ == '__main__': block must be indented.
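
Taken together, a minimal sketch of the question's setup with only these syntax fixes applied (the colons and the commas; just two of the four functions shown for brevity):

import random

def cat(time, dist):
    return random.randint(1, 400) * time + random.randint(1, 5) * dist

def dog(time, dist):
    return random.randint(1, 300) * time + random.randint(1, 7) * dist

if __name__ == '__main__':
    # Note the colon at the end of each def header above and the
    # comma after each item in the dictionary literal below.
    FunArray = {
        1: cat,
        2: dog,
    }
    print FunArray[1](2, 3)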

There are no imports for random and multiprocessing.

FunArray is actually a dictionary, not an array; a plain list is simpler here (see the code below).

xrange(1, 4) yields 1, 2, 3 (4 is not included). If you want 1, 2, 3, 4 you should use xrange(1, 5). However, I would rather iterate over the list directly with enumerate.
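
For example, a small sketch that reuses the four functions from the question and keeps the 1-based index that xrange(1, 5) would have produced, by passing a start value to enumerate:

funcs = [cat, dog, rabbit, turtle]
for i, func in enumerate(funcs, 1):  # i takes the values 1, 2, 3, 4
    print i, func(i, 10)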

You also need to keep references to the workers so that you can retrieve their results later:


import random
import multiprocessing

def cat(time , dist):
    return random.randint(1, 400) * time + random.randint(1, 5) * dist

def dog(time , dist):
    return random.randint(1, 300) * time + random.randint(1, 7) * dist

def rabbit(time , dist):
    return random.randint(1, 200) * time + random.randint(1, 3) * dist

def turtle(time , dist):
    return random.randint(1, 100) * time + random.randint(1, 1) * dist


if __name__ == '__main__':
    funcs = [cat, dog, rabbit, turtle]
    pool = multiprocessing.Pool(processes=2)   # at most two worker processes at a time
    q = 10
    workers = []
    for i, func in enumerate(funcs):
        # apply_async returns an AsyncResult; keep it so the result can be fetched later
        worker = pool.apply_async(func, args=(i, q))
        workers.append(worker)
    for worker in workers:
        print worker.get()   # blocks until that task has finished
    pool.close()
    pool.join()
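
As a side note, under Python 3 the same main block only needs print as a function, and multiprocessing.Pool can be used as a context manager (Python 3.3+); a rough sketch:

if __name__ == '__main__':
    funcs = [cat, dog, rabbit, turtle]
    q = 10
    with multiprocessing.Pool(processes=2) as pool:
        # Submit all tasks, then collect the results before the pool is torn down.
        workers = [pool.apply_async(func, args=(i, q)) for i, func in enumerate(funcs)]
        results = [worker.get() for worker in workers]
    print(results)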