Objective-C: Capturing an iSight image on the Mac using AVFoundation


I previously captured an image from my Mac's iSight camera using QTKit:

- (NSError*)takePicture
{    
    BOOL success;
    NSError* error;

    captureSession = [QTCaptureSession new];
    QTCaptureDevice* device = [QTCaptureDevice defaultInputDeviceWithMediaType: QTMediaTypeVideo];

    success = [device open: &error];
    if (!success) { return error; }

    QTCaptureDeviceInput* captureDeviceInput = [[QTCaptureDeviceInput alloc] initWithDevice: device];
    success = [captureSession addInput: captureDeviceInput error: &error];
    if (!success) { return error; }

    QTCaptureDecompressedVideoOutput* captureVideoOutput = [QTCaptureDecompressedVideoOutput new];
    [captureVideoOutput setDelegate: self];

    success = [captureSession addOutput: captureVideoOutput error: &error];
    if (!success) { return error; }

    [captureSession startRunning];
    return nil;
}
- (void)captureOutput: (QTCaptureOutput*)captureOutput
  didOutputVideoFrame: (CVImageBufferRef)imageBuffer
     withSampleBuffer: (QTSampleBuffer*)sampleBuffer
       fromConnection: (QTCaptureConnection*)connection
{
    CVBufferRetain(imageBuffer);

    if (imageBuffer) {
        [captureSession removeOutput: captureOutput];
        [captureSession stopRunning];

        NSCIImageRep* imageRep = [NSCIImageRep imageRepWithCIImage: [CIImage imageWithCVImageBuffer: imageBuffer]];

        _result = [[NSImage alloc] initWithSize: [imageRep size]];
        [_result addRepresentation: imageRep];

        CVBufferRelease(imageBuffer);

        _done = YES;
    }
}
However, I found out today that QTKit has been deprecated, so we now have to use AVFoundation.
Can someone help me convert this code to its AVFoundation equivalent? Many of the methods seem to have the same names, but at the same time a lot is different, and I'm completely at a loss here... Any help?

OK, I found the solution! Here it is:

- (void)takePicture
{
    NSError* error;
    AVCaptureDevice* device = [AVCaptureDevice defaultDeviceWithMediaType: AVMediaTypeVideo];
    AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice: device error: &error];
    if (!input) {
        _error = error;
        _done = YES;
        return;
    }

    AVCaptureStillImageOutput* output = [AVCaptureStillImageOutput new];
    [output setOutputSettings: @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_32BGRA)}];

    captureSession = [AVCaptureSession new];
    captureSession.sessionPreset = AVCaptureSessionPresetPhoto;

    [captureSession addInput: input];
    [captureSession addOutput: output];
    [captureSession startRunning];

    AVCaptureConnection* connection = [output connectionWithMediaType: AVMediaTypeVideo];
    [output captureStillImageAsynchronouslyFromConnection: connection completionHandler: ^(CMSampleBufferRef sampleBuffer, NSError* error) {
        if (error) {
            _error = error;
            _result = nil;
        }
        else {
            CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

            if (imageBuffer) {
                CVBufferRetain(imageBuffer);

                NSCIImageRep* imageRep = [NSCIImageRep imageRepWithCIImage: [CIImage imageWithCVImageBuffer: imageBuffer]];

                _result = [[NSImage alloc] initWithSize: [imageRep size]];
                [_result addRepresentation: imageRep];

                CVBufferRelease(imageBuffer);
            }
        }

        _done = YES;
    }];
}
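
Note that AVCaptureStillImageOutput was itself later deprecated (in macOS 10.15) in favor of AVCapturePhotoOutput. Below is a minimal sketch of the same capture using AVCapturePhotoOutput, assuming the same `captureSession`, `_error`, `_result`, and `_done` ivars as above and that the class adopts AVCapturePhotoCaptureDelegate; treat it as an outline, not a drop-in replacement:

```objectivec
// Requires macOS 10.15+; AVCapturePhotoOutput replaces AVCaptureStillImageOutput.
- (void)takePictureWithPhotoOutput
{
    NSError* error;
    AVCaptureDevice* device = [AVCaptureDevice defaultDeviceWithMediaType: AVMediaTypeVideo];
    AVCaptureDeviceInput* input = [AVCaptureDeviceInput deviceInputWithDevice: device error: &error];
    if (!input) {
        _error = error;
        _done = YES;
        return;
    }

    AVCapturePhotoOutput* output = [AVCapturePhotoOutput new];

    captureSession = [AVCaptureSession new];
    captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
    [captureSession addInput: input];
    [captureSession addOutput: output];
    [captureSession startRunning];

    // The delegate callback below fires when the photo is ready.
    [output capturePhotoWithSettings: [AVCapturePhotoSettings photoSettings]
                            delegate: self];
}

// AVCapturePhotoCaptureDelegate
- (void)captureOutput: (AVCapturePhotoOutput*)output
didFinishProcessingPhoto: (AVCapturePhoto*)photo
                error: (NSError*)error
{
    if (error) {
        _error = error;
    }
    else {
        // fileDataRepresentation returns the encoded image data (e.g. JPEG/HEIF).
        _result = [[NSImage alloc] initWithData: [photo fileDataRepresentation]];
    }
    _done = YES;
}
```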

I hope this helps anyone who runs into problems doing the same thing.

This link from Apple also provides instructions for converting from QTKit to AVFoundation.

Thanks a lot! One problem, though: my pictures come out very dark. Did you have this issue, or find a way to increase the exposure time?

Same problem here, the pictures are very dark.