
iOS: AVCaptureStillImageOutput never calls its completion handler


The following code doesn't work. What's wrong with it?

AVCaptureDevice * videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
AVCaptureDeviceInput * videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
AVCaptureSession * captureSession = [[AVCaptureSession alloc] init];
captureSession.sessionPreset = AVCaptureSessionPresetMedium;
if (![captureSession canAddInput:videoInput])
    NSLog(@"Can't add input");
[captureSession addInput:videoInput];

self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
[self.stillImageOutput setOutputSettings:@{AVVideoCodecKey:AVVideoCodecJPEG}];
if (![captureSession canAddOutput:videoInput])
    NSLog(@"Can't add output");
[captureSession addOutput:self.stillImageOutput];

[self.stillImageOutput captureStillImageAsynchronouslyFromConnection:[self.stillImageOutput.connections lastObject]
                                                   completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error)
                                                   {
                                                       NSLog(@"!!!");

                                                       if (imageDataSampleBuffer == NULL)
                                                       {
                                                           NSLog(@"%@", error);
                                                           return;
                                                       }

                                                       NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                                                       UIImage *image = [[UIImage alloc] initWithData:imageData];
                                                       self.imageView.image = image;
                                                   }];

// Creating preview layer
self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
self.previewLayer.frame = self.view.layer.bounds;
[self.view.layer addSublayer:self.previewLayer];

[captureSession startRunning];
The AVCaptureVideoPreviewLayer works fine, but AVCaptureStillImageOutput never calls the completion handler...

This, on the other hand, works fine:

- (void)viewDidLoad
{
    [super viewDidLoad];

    AVCaptureDevice * videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput * videoInput = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
    AVCaptureSession * captureSession = [[AVCaptureSession alloc] init];
    captureSession.sessionPreset = AVCaptureSessionPresetMedium;
    [captureSession addInput:videoInput];

    // The still image output must actually be created before it is added,
    // otherwise addOutput: receives nil and no connection ever exists.
    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    [self.stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
    [captureSession addOutput:self.stillImageOutput];
    [captureSession startRunning];

    // Creating preview layer
    self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:captureSession];
    self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    self.previewLayer.frame = self.view.layer.bounds;
    [self.view.layer insertSublayer:self.previewLayer atIndex:0];
}

- (void)timerFired:(NSTimer *)timer
{
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:[self.stillImageOutput.connections lastObject]
                                                       completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error)
                                                       {
                                                           NSLog(@"!!!");

                                                           if (imageDataSampleBuffer == NULL)
                                                               NSLog(@"%@", error);

                                                           NSData * imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                                                           UIImage * image = [[UIImage alloc] initWithData:imageData];
                                                           self.imageView.image = image;
                                                       }];
}

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    [NSTimer scheduledTimerWithTimeInterval:0.5 target:self selector:@selector(timerFired:) userInfo:nil repeats:YES];
}

You need to set up and start the session in one method, and then use a separate method to capture:

/////////////////////////////////////////////////
////
//// Utility to find front camera
////
/////////////////////////////////////////////////
-(AVCaptureDevice *) frontFacingCameraIfAvailable{

    NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *captureDevice = nil;

    for (AVCaptureDevice *device in videoDevices){

        if (device.position == AVCaptureDevicePositionFront){

            captureDevice = device;
            break;
        }
    }

    //  couldn't find one on the front, so just get the default video device.
    if (!captureDevice){

        captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    }

    return captureDevice;
}
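
For what it's worth, on iOS 10 and later `devicesWithMediaType:` is deprecated and the same lookup can be done with `AVCaptureDeviceDiscoverySession`. A sketch, not part of the original answer (the method name `frontFacingCameraModern` is made up here):

```objectivec
// Hedged sketch: find the front wide-angle camera via the iOS 10+
// discovery-session API, falling back to the default video device.
- (AVCaptureDevice *)frontFacingCameraModern {
    AVCaptureDeviceDiscoverySession *discovery =
        [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera]
                                                               mediaType:AVMediaTypeVideo
                                                                position:AVCaptureDevicePositionFront];
    // firstObject is nil on devices without a front camera, so fall back.
    return discovery.devices.firstObject
        ?: [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}
```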

/////////////////////////////////////////////////
////
//// Setup Session, attach Video Preview Layer
//// and Capture Device, start running session
////
/////////////////////////////////////////////////
-(void) setupCaptureSession {
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    // Without a frame the layer is added but renders nowhere.
    captureVideoPreviewLayer.frame = self.view.layer.bounds;
    [self.view.layer addSublayer:captureVideoPreviewLayer];

    NSError *error = nil;
    AVCaptureDevice *device = [self frontFacingCameraIfAvailable];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately; don't add a nil input.
        NSLog(@"ERROR: trying to open camera: %@", error);
        return;
    }
    [session addInput:input];

    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys: AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [self.stillImageOutput setOutputSettings:outputSettings];

    [session addOutput:self.stillImageOutput];

    [session startRunning];
}


/////////////////////////////////////////////////
////
//// Method to capture Still Image from 
//// Video Preview Layer
////
/////////////////////////////////////////////////
-(void) captureNow {
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in self.stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo] ) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) { break; }
    }

    NSLog(@"about to request a capture from: %@", self.stillImageOutput);
    __weak typeof(self) weakSelf = self;
    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {

         // Bail out on failure instead of passing NULL to the JPEG converter.
         if (imageSampleBuffer == NULL) {
             NSLog(@"Capture failed: %@", error);
             return;
         }

         NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
         UIImage *image = [[UIImage alloc] initWithData:imageData];

         [weakSelf displayImage:image];
     }];
}
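
Wiring the two methods together might look like this (a sketch, not from the original answer; it assumes the controller declares `stillImageOutput` as a strong property and has a capture button wired up in the storyboard):

```objectivec
- (void)viewDidLoad {
    [super viewDidLoad];
    [self setupCaptureSession];   // configure and start the session once
}

- (IBAction)captureButtonTapped:(id)sender {
    [self captureNow];            // grab a still from the running session
}
```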

The only reason "doing it in a separate method" fixes this is the side effect of keeping the AVCaptureSession retained by the controller. I think there is a bug in AVCaptureStillImageOutput that prevents it from keeping track of its AVCaptureSession under automatic reference counting.

Ah, good point. Yes, a colleague and I were trying to get it working the way you attempted last Friday, with no success. I was up into the small hours and found it best to initialize the AVCaptureSession separately. Good luck!

I'm trying to do the same thing in Swift and it doesn't work... is there any reason it wouldn't? I have the same code...

In my case, stillImageOutput.connections was nil because I had allocated the AVCaptureSession and AVCaptureVideoPreviewLayer twice. Adding a check for session == nil, and only then doing [[AVCaptureStillImageOutput alloc] init] and setting all the still image output presets, fixed it.
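
The retention issue discussed in this thread can also be avoided explicitly, instead of relying on side effects, by giving the session a strong property on the controller. A sketch (property and class names are illustrative, not from the original code):

```objectivec
@interface CameraViewController : UIViewController
// Strong references keep ARC from deallocating the session (and with it
// the output's connections) before the completion handler can fire.
@property (nonatomic, strong) AVCaptureSession *captureSession;
@property (nonatomic, strong) AVCaptureStillImageOutput *stillImageOutput;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;
@end
```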