macOS AVFoundation: changing the frame size (AVCaptureSessionPreset) does not affect the sampleBuffer


I'm trying to capture video frames from my MacBook's camera and process them on the fly (for later face detection). To reduce memory usage, I want to lower the capture resolution from the preset 1200x720 down to 640x480.

Here is the code that sets up the capture session:

    _session = [[AVCaptureSession alloc] init];

    if ([_session canSetSessionPreset:AVCaptureSessionPreset640x480]){
        [_session setSessionPreset:AVCaptureSessionPreset640x480];
        NSLog(@"resolution preset changed");
    }

    // configure input
    _camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    _deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:_camera error:nil];


    // configure output
    _videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    NSDictionary* newSettings = @{ (NSString*) kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
    _videoOutput.videoSettings = newSettings;
    //discard if the data output queue is blocked
    [_videoOutput setAlwaysDiscardsLateVideoFrames:YES];

    // process frames on another queue
    dispatch_queue_t videoDataOutputQueue;
    videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
    [_videoOutput setSampleBufferDelegate:videoBufferDelegate queue:videoDataOutputQueue];

    [_session addInput:_deviceInput];
    [_session addOutput:_videoOutput];

    [_session startRunning];
After this, the session is set up, "resolution preset changed" is logged correctly, and the video data is forwarded to the delegate on the other queue for processing. When I inspect session.sessionPreset, it reports the preset as AVCaptureSessionPreset640x480.

Now, in the delegate:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
        didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
        fromConnection:(AVCaptureConnection *)connection
{

    //get the image from buffer, transform it to CIImage
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    self.image = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer];
}
When I check self.image.extent.size, it shows the wrong size of 1200x720, as if I had never changed the preset. Even inspecting the method's sampleBuffer parameter directly shows the dimensions as 1200x720.
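For reference, the buffer dimensions can also be read straight from the pixel buffer inside the delegate callback, without going through CIImage (a minimal sketch using the CoreVideo accessors CVPixelBufferGetWidth/CVPixelBufferGetHeight):

    // Inside captureOutput:didOutputSampleBuffer:fromConnection:
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    NSLog(@"buffer size: %zux%zu", width, height);

This confirms the mismatch is in the delivered buffers themselves, not in the CIImage conversion.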


I've now spent hours searching the internet and Apple's documentation without finding a solution. I hope you can help me.

I seem to have found a solution myself (or at least a workaround). Commenting out the following lines caused the expected change in the buffer resolution:

    NSDictionary* settings = @{ (NSString*) kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)};
    _videoOutput.videoSettings = settings;
My assumption is that applying these output settings causes the AVCaptureSessionPreset to be overridden. However, it's not entirely clear to me why that happens (a pixel-format setting shouldn't have any effect on the resolution, should it?).
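If the BGRA pixel format is still needed (e.g. for downstream processing), one possible middle ground is to request the desired dimensions explicitly in videoSettings alongside the pixel format, using the standard CoreVideo keys kCVPixelBufferWidthKey and kCVPixelBufferHeightKey. This is an untested sketch of that idea, not a confirmed fix:

    // Request both the pixel format and explicit dimensions, so the
    // output does not fall back to the camera's native buffer size.
    NSDictionary *settings = @{
        (NSString *)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA),
        (NSString *)kCVPixelBufferWidthKey:  @640,
        (NSString *)kCVPixelBufferHeightKey: @480,
    };
    _videoOutput.videoSettings = settings;

Whether the capture pipeline honors the width/height keys here may depend on the macOS version and device, so verify the delivered buffer size in the delegate after applying this.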