iPhone: how can I capture only selected camera frames using AVCaptureSession?

Tags: iphone, cocoa-touch, avfoundation

I'm trying to use AVCaptureSession to get images from the front camera for processing. So far, each time a new frame became available I simply assigned it to a variable, and I ran an NSTimer that checks every tenth of a second whether there is a new frame and, if so, processes it.

What I'd like is to grab one frame, freeze the camera, and fetch the next frame only when I choose to, something like [captureSession getNextFrame]. Is that possible?

Here is part of my code, although I'm not sure how much it will help:

- (void)startFeed {
    loopTimerIndex = 0;

    // Index 1 is assumed to be the front camera; checking each device's
    // `position` property would be more robust.
    NSArray *captureDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput
                                          deviceInputWithDevice:[captureDevices objectAtIndex:1]
                                          error:nil];

    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.minFrameDuration = CMTimeMake(1, 10); // cap delivery at 10 fps
    captureOutput.alwaysDiscardsLateVideoFrames = YES;

    dispatch_queue_t queue = dispatch_queue_create("cameraQueue", NULL);
    [captureOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];

    captureSession = [[AVCaptureSession alloc] init];
    captureSession.sessionPreset = AVCaptureSessionPresetLow;
    [captureSession addInput:captureInput];
    [captureSession addOutput:captureOutput];

    imageView = [[UIImage alloc] init];

    [captureSession startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    loopTimerIndex++;

    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Wrap the BGRA pixel data in a bitmap context and copy it out as a CGImage.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace,
                                                    kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);
    CGContextRelease(newContext);
    CGColorSpaceRelease(colorSpace);

    imageView = [UIImage imageWithCGImage:newImage scale:1.0 orientation:UIImageOrientationLeftMirrored];
    [delegate updatePresentor:imageView];
    if (loopTimerIndex == 1) {
        [delegate feedStarted];
    }

    CGImageRelease(newImage);
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    [pool drain];
}

You don't actively poll the camera for frames; that's not how the capture pipeline is architected. Instead, if you only want to display one frame every tenth of a second rather than one every 1/30 second or faster, you should simply ignore the frames in between.

For example, you could maintain a timestamp and compare against it every time -captureOutput:didOutputSampleBuffer:fromConnection: fires. If 0.1 seconds or more have elapsed since the stored timestamp, process and display the camera frame and reset the timestamp to the current time. Otherwise, ignore the frame.