
Objective-C AVCaptureOutput delegate


I am creating an app that uses the

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection { }

method, but the method is never called. To explain further, the app is a video recording app built using the code from a tutorial. When I run the tutorial's code in Xcode, the method above runs; but when I copy it into my app, without modifying it in any way, it is never called.

Here is the code being used:

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    NSError *error = nil;
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    if ([[UIDevice currentDevice] userInterfaceIdiom] == UIUserInterfaceIdiomPhone){
        [session setSessionPreset:AVCaptureSessionPreset640x480];
    } else {
        [session setSessionPreset:AVCaptureSessionPresetPhoto];
    }
    // Select a video device, make an input
    AVCaptureDevice *device;
    AVCaptureDevicePosition desiredPosition = AVCaptureDevicePositionFront;
    // find the front facing camera
    for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if ([d position] == desiredPosition) {
            device = d;
            isUsingFrontFacingCamera = YES;
            break;
        }
    }
    // fall back to the default camera.
    if( nil == device )
    {
        isUsingFrontFacingCamera = NO;
        device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }
    // get the input device
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if( !error ) {

        // add the input to the session
        if ( [session canAddInput:deviceInput] ){
            [session addInput:deviceInput];
        }

        previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
        previewLayer.backgroundColor = [[UIColor blackColor] CGColor];
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;

        CALayer *rootLayer = [previewView layer];
        [rootLayer setMasksToBounds:YES];
        [previewLayer setFrame:[rootLayer bounds]];
        [rootLayer addSublayer:previewLayer];
        [session startRunning];

    }

    session = nil;

    if (error) {
        UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:
                                  [NSString stringWithFormat:@"Failed with error %d", (int)[error code]]
                                                            message:[error localizedDescription]
                                                           delegate:nil
                                                  cancelButtonTitle:@"Dismiss"
                                                  otherButtonTitles:nil];
        [alertView show];
        [self teardownAVCapture];
    }

    NSDictionary *detectorOptions = [[NSDictionary alloc] initWithObjectsAndKeys:CIDetectorAccuracyLow, CIDetectorAccuracy, nil];
    faceDetector = [CIDetector detectorOfType:CIDetectorTypeFace context:nil options:detectorOptions];

    // Make a video data output
    videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    // we want BGRA, both CoreGraphics and OpenGL work well with 'BGRA'
    NSDictionary *rgbOutputSettings = [NSDictionary dictionaryWithObject:
                                       [NSNumber numberWithInt:kCMPixelFormat_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    [videoDataOutput setVideoSettings:rgbOutputSettings];
    [videoDataOutput setAlwaysDiscardsLateVideoFrames:YES]; // discard if the data output queue is blocked

    // create a serial dispatch queue used for the sample buffer delegate
    // a serial dispatch queue must be used to guarantee that video frames will be delivered in order
    // see the header doc for setSampleBufferDelegate:queue: for more information

    videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
    [videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
    if ( [session canAddOutput:videoDataOutput] ){
        [session addOutput:videoDataOutput];
    }

    // get the output for doing face detection.
    [[videoDataOutput connectionWithMediaType:AVMediaTypeVideo] setEnabled:YES];

    //[self setupCaptureSession];
}

OK, I think I see what the problem is. You call [session startRunning] before the video data output is even set up. A session without a video data output... well, it will not call the AVCaptureOutput delegate.
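In sketch form, the reordering this answer describes could look like the following. This is a minimal sketch, not the tutorial's exact code: the front-camera selection and preview layer are omitted for brevity, and _session is an assumed AVCaptureSession instance variable so that something keeps the session alive after viewDidLoad returns.

- (void)viewDidLoad
{
    [super viewDidLoad];
    NSError *error = nil;

    // Sketch only: _session is an assumed instance variable, not a local.
    _session = [[AVCaptureSession alloc] init];
    [_session setSessionPreset:AVCaptureSessionPreset640x480];

    // Input: default camera (front-camera selection omitted for brevity).
    AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if ([_session canAddInput:deviceInput]) {
        [_session addInput:deviceInput];
    }

    // Output and delegate, attached before the session ever runs.
    videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
    [videoDataOutput setSampleBufferDelegate:self queue:videoDataOutputQueue];
    if ([_session canAddOutput:videoDataOutput]) {
        [_session addOutput:videoDataOutput];
    }

    // Only now, with input, output, and delegate in place, start capturing.
    [_session startRunning];
}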


Do you conform to AVCaptureVideoDataOutputSampleBufferDelegate? Did you set the delegate? @YesterBunny This is all I have set:

@interface SquareCamViewController : UIViewController

I have been reading through the tutorial's code to verify that I am not missing anything, and I have not found anything missing. Search your code for setSampleBufferDelegate. Do you have that set? @YesterBunny I just checked the code, and it is there. Now I understand even less why it is never called. Hmm... strange. If you have more code, feel free to post some of it here so someone can help. Thanks!! This helped a lot. Everything is solved now. :)
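For reference, the pieces the comments are checking have to line up roughly as in the sketch below: protocol conformance on the interface, the delegate registration shown in the question's code, and the callback itself. The class name follows the question; the callback body is an assumption, included only to show the structure.

@interface SquareCamViewController : UIViewController <AVCaptureVideoDataOutputSampleBufferDelegate>
@end

@implementation SquareCamViewController

// Delivered on videoDataOutputQueue for each captured frame, but only
// while the session is running with this output attached.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    // ... hand pixelBuffer to the CIDetector for face detection ...
}

@end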