
Taking turns processing values from a Python queue


I have a queue holding 100 numbers, 1 to 100. First I have a process that fills the queue and then prints 'Queue filled'. Next I have two functions that print the current value from the queue. I am trying to get the processes to take turns printing values from the queue. Here is my code:

import multiprocessing as mp

def fillQueue(lookup,q):
    list(map(q.put,lookup))
    print('Queue filled')

def printQueue1(q):
    while not q.empty():
        print('Process 1:', (q.get()))
    print('Process 1: Queue is empty!')

def printQueue2(q):
    while not q.empty():
        print('Process 2:', (q.get()))
    print('Process 2: Queue is empty!')

if __name__ == "__main__":
    pool = mp.Pool(processes=3)
    manager = mp.Manager()
    q = manager.Queue()

    lookup = []
    count = 1
    while count < 101:
        lookup.append(count)
        count = count + 1

    p2 = pool.apply_async(printQueue1,(q,))
    p3 = pool.apply_async(printQueue2,(q,))
    p1 = pool.apply_async(fillQueue,(lookup,q))

    pool.close()
    pool.join()
What I would like to get is:

Queue filled
Process 1: 1
Process 2: 2
Process 1: 3
Process 2: 4
Process 1: 5

Is there any way to accomplish this? Every time I run the program I get different results, so something strange is going on. Thanks.

So, apply_async applies the function asynchronously - which means the 3 processes you fire off all run at the same time and, to some extent, fight with each other.

Since you are not triggering them deterministically, the order in which they run can change every time you run the program.
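
To make the asynchrony concrete, here is a minimal, self-contained sketch (the say function is made up purely for illustration): apply_async returns an AsyncResult immediately, the submitted calls race each other, and only result.get() waits for a particular call to finish.

    import multiprocessing as mp
    import os

    def say(tag):
        # hypothetical worker, only used to show that the calls run concurrently
        return 'worker %s ran in pid %s' % (tag, os.getpid())

    if __name__ == "__main__":
        pool = mp.Pool(processes=3)
        # apply_async returns AsyncResult handles right away; the calls race each other
        results = [pool.apply_async(say, (i,)) for i in range(3)]
        pool.close()
        pool.join()
        for r in results:
            print(r.get())  # get() blocks until that particular call has finished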

I assume what you want is:

  • the queue to be filled before the processes try to access it
  • the 'work' to be split roughly evenly between the processes
  • even so, unless you constrain the functions somehow, the order in which they get() items will still be fairly random. If you really need function 1 to only get the odd numbers and function 2 only the even ones, in strict order, then you probably do not need multiprocessing at all (see the single-process sketch after the example output below).

    import multiprocessing as mp
    
    
    def fillQueue(lookup, q):
        list(map(q.put, lookup))
        print('Queue filled')
    
    
    def printQueue(q, id):
        while not q.empty():
            print('Process {}: {}'.format(id, q.get()))
        print('Process {}: Queue is empty!'.format(id))
    
    
    if __name__ == "__main__":
        pool = mp.Pool(processes=3)
        manager = mp.Manager()
        q = manager.Queue()
    
        # no need to build the list with a manual counter; a range works directly
        lookup = range(101)
    
        # do not fill the queue while processes are running, do it beforehand!
        fillQueue(lookup, q)
    
        # don't need different functions, since they are doing the same work
        # just fire off multiple copies of the same function
        p1 = pool.apply_async(printQueue, (q, 1,))
        p2 = pool.apply_async(printQueue, (q, 2,))
    
        pool.close()
        pool.join()
    
    Example output:

    Queue filled
    Process 2: 0
    Process 2: 1
    Process 2: 2
    Process 2: 3
    Process 2: 4
    Process 2: 5
    Process 1: 6
    Process 2: 7
    Process 1: 8
    Process 2: 9
    Process 2: 10
    Process 1: 11
    

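As a side note on the last bullet above: if you really do need strict odd/even alternation, a plain single-process loop already gives that ordering for free. A minimal sketch using queue.Queue instead of multiprocessing (the name printQueueStrict is made up for illustration):

    import queue

    def printQueueStrict(q):
        # alternate the label 1, 2, 1, 2, ... so the output is strictly interleaved
        turn = 1
        while not q.empty():
            print('Process %d: %s' % (turn, q.get()))
            turn = 2 if turn == 1 else 1
        print('Queue is empty!')

    if __name__ == "__main__":
        q = queue.Queue()
        for n in range(1, 101):
            q.put(n)
        printQueueStrict(q)
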
    You can create one Queue object per process to act as a 'baton' that indicates which process gets to dequeue the next item from the main queue. In the main loop of each worker function, it should first dequeue from its own baton queue before trying to dequeue from the main queue, and then pass the baton on by putting an item into the other process's baton queue. The filling process kicks off the dequeuing by putting an item into the baton queue of the process that should run first. This works because queue.get blocks until there is an item in the queue:

    import multiprocessing as mp
    import time
    
    def fillQueue(lookup, q, baton_first):
        list(map(q.put,lookup))
        print('Queue filled')
        baton_first.put(None)
    
    def printQueue(id, q, baton_self, baton_other):
        while True:
            baton_self.get()
            try:
                if q.empty():
                    break
                print('Process %s:' % id, (q.get()))
            # use finally to always pass on the baton whether the loop breaks or not
            finally:
                baton_other.put(None)
            time.sleep(1) # the actual work should be performed here
        print('Process %s: Queue is empty!' % id)
    
    if __name__ == "__main__":
        pool = mp.Pool(processes=3)
        manager = mp.Manager()
        q = manager.Queue()
        baton1 = manager.Queue()
        baton2 = manager.Queue()
    
        p2 = pool.apply_async(printQueue,(1, q, baton1, baton2))
        p3 = pool.apply_async(printQueue,(2, q, baton2, baton1))
        p1 = pool.apply_async(fillQueue, (list(range(1, 11)), q, baton1))
    
        pool.close()
        pool.join()
    
    This produces:

    Queue filled
    Process 1: 1
    Process 2: 2
    Process 1: 3
    Process 2: 4
    Process 1: 5
    Process 2: 6
    Process 1: 7
    Process 2: 8
    Process 1: 9
    Process 2: 10
    Process 1: Queue is empty!
    Process 2: Queue is empty!
    

    The queue items are not distributed evenly between the processes; for whatever reason, process 1 seems to be a little faster. That is perfectly fine. If you wanted an even distribution you would probably need additional synchronization (a lock), but that would make using two printing processes instead of one rather pointless.

    Your explanation of how the queue works is very helpful. Thank you!

    Thank you, this is exactly what I was trying to do! One question: when I increase the count to 100, the 'Queue filled' message shows up after the processing has already started. How is it processing the queue if it has not been filled yet, or is the 'Queue filled' message just printed late?

    Glad to help. That happens because all 3 processes run at the same time, so the print('Queue filled') statement does not necessarily run before items start being dequeued. If you really want that message to always be printed before any dequeuing starts, you can have the filling process, rather than the main process, execute the baton1.put(None) statement that kicks off the dequeuing. I have updated my answer to reflect this change.

    That makes sense. Thanks again for your help and the explanation!
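
    For contrast, the earlier variant the comments refer to, where the main process rather than the filler hands over the first baton, would presumably have looked roughly like this (a sketch under that assumption, not the author's exact original; fillQueue here takes only lookup and q). Nothing orders 'Queue filled' before the first dequeue, and if the main process wins the race badly enough the workers may even see an empty queue:

        import multiprocessing as mp
        import time

        def fillQueue(lookup, q):
            list(map(q.put, lookup))
            print('Queue filled')  # may print only after dequeuing has already begun

        def printQueue(id, q, baton_self, baton_other):
            while True:
                baton_self.get()
                try:
                    if q.empty():
                        break
                    print('Process %s:' % id, (q.get()))
                finally:
                    baton_other.put(None)
                time.sleep(1)
            print('Process %s: Queue is empty!' % id)

        if __name__ == "__main__":
            pool = mp.Pool(processes=3)
            manager = mp.Manager()
            q = manager.Queue()
            baton1 = manager.Queue()
            baton2 = manager.Queue()

            p2 = pool.apply_async(printQueue, (1, q, baton1, baton2))
            p3 = pool.apply_async(printQueue, (2, q, baton2, baton1))
            p1 = pool.apply_async(fillQueue, (list(range(1, 11)), q))

            # the main process starts the baton passing instead of fillQueue,
            # so 'Queue filled' is not guaranteed to print before the first item
            baton1.put(None)

            pool.close()
            pool.join()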