Python 3.x producer-consumer message sharing with multiprocessing not working

I am trying to run a scenario where I have a producer that captures frames from a webcam and puts them on a queue. A consumer then reads images from the input queue, does some processing, and puts the output image on an output queue.

The problem is that the consumer's read from the queue does not seem to block. When it reads a value from the queue, the reported size is always a constant 128, which looks wrong. I am sure the images I am enqueuing are much larger.

from __future__ import print_function

import multiprocessing
import time
import logging
import sys

import cv2


class Consumer(multiprocessing.Process):
    
    def __init__(self, incoming_q, outgoing_q):
        multiprocessing.Process.__init__(self)
        self.outgoing_q = outgoing_q
        self.incoming_q = incoming_q

    def run(self):
        proc_name = self.name
        print(f"{proc_name} - inside process_feed..starting")
        while True:
            #print(f"size of incoming_q=>{self.incoming_q.qsize()}")
            try:
                #print(f"{proc_name} - size of B incoming_q=>{self.incoming_q.qsize()}")
                image_np = self.incoming_q.get(True)
                size_of_img = sys.getsizeof(image_np)
                #print(f"{proc_name} - size of A incoming_q=>{self.incoming_q.qsize()}")
                if size_of_img > 128:
                    print(f"{proc_name} - size image=>{size_of_img}")
                    time.sleep(1)
                    self.outgoing_q.put_nowait(image_np)
            except:
                pass
        print("inside process_feed..ending")


class Producer(multiprocessing.Process):
    
    def __init__(self, incoming_q, outgoing_q):
        multiprocessing.Process.__init__(self)
        self.incoming_q = incoming_q
        self.outgoing_q = outgoing_q

    def run(self):
        proc_name = self.name
        print("inside capture_feed")
        stream = cv2.VideoCapture(0)
        try:
            counter = 0
            while True:
                counter += 1
                if counter == 1:
                    if not self.incoming_q.full():
                        (grabbed, image_np) = stream.read()
                        size_of_img = sys.getsizeof(image_np)
                        print(f"{proc_name}........B.......=>{self.incoming_q.qsize()}")
                        print(f"{proc_name} - size image=>{size_of_img}")
                        self.incoming_q.put(image_np)
                        print(f"{proc_name}........A.......=>{self.incoming_q.qsize()}")
                    counter = 0
                
                try:
                    image_np = self.outgoing_q.get_nowait()
                    logging.info("reading value for o/p")
                    cv2.imshow('object detection', image_np)
                except:
                    pass

                if cv2.waitKey(25) & 0xFF == ord('q'):
                    break
        finally:
            stream.release()
            cv2.destroyAllWindows()
        print("inside capture_feed..ending")
    

if __name__ == '__main__':
    logging.basicConfig(stream=sys.stdout, level=logging.INFO)
    stream = cv2.VideoCapture(0)
    
    incoming_q = multiprocessing.Queue(maxsize=100)
    outgoing_q = multiprocessing.Queue(maxsize=100)

    logging.info("before start of thread")
    
    max_process = 1
    processes = []    
    processes.append(Producer(incoming_q, outgoing_q))
    for i in range(max_process):
        p = Consumer(incoming_q, outgoing_q)
        p.daemon = True
        processes.append(p)
    logging.info("inside main thread..middle")

    for p in processes:
        p.start()

    logging.info("inside main thread..ending")
    logging.info("waiting in main thread too....")
    logging.info("waiting in main thread finished....")
    for p in processes:
        p.join()
    logging.info("inside main thread..ended")

I was able to figure out the problem with my approach. I had missed the whole concept of pickling (serialization).

I changed the code to serialize the numpy array before writing it to the queue and to deserialize it after reading. The code started working as expected.

Also, printing 128 as the size of the np array is actually fine; I had misinterpreted that number.
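The constant 128 comes from `sys.getsizeof`, which for a NumPy array only counts memory the array object itself owns. An array rebuilt from pickled bytes (which is what a `multiprocessing.Queue` transfer amounts to) typically does not own its buffer, so `getsizeof` reports only the small object header; `arr.nbytes` is the reliable measure of the pixel data. A minimal sketch of the difference (the exact header size varies by platform and NumPy version):

```python
import pickle
import sys

import numpy as np

# A frame-sized array that owns its buffer: getsizeof includes the data.
frame = np.zeros((480, 640, 3), dtype=np.uint8)
print(frame.nbytes)             # 921600 bytes of pixel data
print(sys.getsizeof(frame))     # object header + buffer, larger than nbytes

# Round-trip through pickle, as a multiprocessing.Queue does under the hood.
restored = pickle.loads(pickle.dumps(frame))
print(restored.nbytes)          # still 921600: the data is all there
print(sys.getsizeof(restored))  # small constant (~128): buffer is not owned
```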

import pickle

import numpy as np


def serialize_ndarray(arr: np.ndarray) -> bytes:
    # pickle the frame into a bytes payload before putting it on the queue
    return pickle.dumps(arr)


def deserialize_ndarray(payload: bytes) -> np.ndarray:
    # restore the frame after getting the payload from the queue
    return pickle.loads(payload)
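Wired into the producer/consumer, the helpers are used around the queue calls; this is just a sketch with the queue operations shown as comments, and checking `nbytes` on the consumer side (rather than `sys.getsizeof`) is my suggestion for verifying the frame arrived intact:

```python
import pickle

import numpy as np


def serialize_ndarray(arr: np.ndarray) -> bytes:
    return pickle.dumps(arr)


def deserialize_ndarray(payload: bytes) -> np.ndarray:
    return pickle.loads(payload)


# Producer side: serialize before enqueueing.
frame = np.zeros((480, 640, 3), dtype=np.uint8)  # stand-in for stream.read()
payload = serialize_ndarray(frame)               # then: incoming_q.put(payload)

# Consumer side: deserialize after dequeueing, measure the real data size.
image_np = deserialize_ndarray(payload)          # after: incoming_q.get(True)
print(image_np.nbytes)                           # 921600, not 128
```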