iOS: When using AVFoundation, I don't know which object actually contains the captured image

I have a photo-taking app that uses AVFoundation. So far everything works fine.

However, one thing confuses me: which object actually contains the captured image?

I've logged all the objects and some of their properties, but I still can't figure out where the captured image ends up.

Here is the code where I set up the capture session:

self.session = [[AVCaptureSession alloc] init];
[self.session setSessionPreset:AVCaptureSessionPresetPhoto];

self.inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

NSError *error;
self.deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:self.inputDevice error:&error];

if ([self.session canAddInput:self.deviceInput])
    [self.session addInput:self.deviceInput];

self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];

self.rootLayer = [[self view] layer];
[self.rootLayer setMasksToBounds:YES];

[self.previewLayer setFrame:CGRectMake(0, 0, self.rootLayer.bounds.size.width, self.rootLayer.bounds.size.height)];
[self.previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

[self.rootLayer insertSublayer:self.previewLayer atIndex:0];

self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
[self.session addOutput:self.stillImageOutput];

[self.session startRunning];
}
And here is the code that captures a still image when the user presses the capture button:

-(IBAction)stillImageCapture {

    AVCaptureConnection *videoConnection = nil;
    videoConnection.videoOrientation = AVCaptureVideoOrientationPortrait;

    for (AVCaptureConnection *connection in self.stillImageOutput.connections){
        for (AVCaptureInputPort *port in [connection inputPorts]){
            if ([[port mediaType] isEqual:AVMediaTypeVideo]){
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    [self.stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {

        [self.session stopRunning];

    }];
}
When the user presses the capture button and this code runs, the captured image shows up on the iPhone screen just fine, but I can't tell which object actually holds the captured image.

Any help is appreciated.

Here is what actually contains the image.

In your captureStillImageAsynchronouslyFromConnection completion handler, you'll want something like this:

NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
UIImage* capturedImage = [[UIImage alloc] initWithData:imageData];
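If the goal is simply to put that result on screen, the resulting UIImage can then be handed to a view. A minimal sketch, assuming a UIImageView outlet named capturedImageView (that outlet is not part of the original code); the dispatch to the main queue matters because the completion handler is not guaranteed to run on the main thread:

// Hypothetical outlet: self.capturedImageView is a UIImageView already in the view hierarchy.
dispatch_async(dispatch_get_main_queue(), ^{
    // UI work must happen on the main queue.
    self.capturedImageView.image = capturedImage;
});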
Mine works doing it like this:

- (void)captureStillImage
{
    @try {
        AVCaptureConnection *videoConnection = nil;
        for (AVCaptureConnection *connection in _stillImageOutput.connections){
            for (AVCaptureInputPort *port in [connection inputPorts]){
                if ([[port mediaType] isEqual:AVMediaTypeVideo]){
                    videoConnection = connection;
                    break;
                }
            }
            if (videoConnection) {
                break;
            }
        }

        NSLog(@"About to request a capture from: %@", [self stillImageOutput]);
        [[self stillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection
                                                             completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {

            // This is here for when we need to implement Exif stuff.
            //CFDictionaryRef exifAttachments = CMGetAttachment(imageSampleBuffer, kCGImagePropertyExifDictionary, NULL);

            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];

            // Create a UIImage from the sample buffer data
            _capturedImage = [[UIImage alloc] initWithData:imageData];

            BOOL autoSave = YES;
            if (autoSave)
            {
                UIImageWriteToSavedPhotosAlbum(_capturedImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
            }

        }];
    }
    @catch (NSException *exception) {
        NSLog(@"ERROR: Unable to capture still image from AVFoundation camera: %@", exception);
    }
}
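Because UIImageWriteToSavedPhotosAlbum is passed a completion selector above, the class also needs a matching callback method. A minimal sketch of that callback; the selector signature is the one UIKit expects, while the logging inside is just an illustrative assumption:

// Invoked by UIImageWriteToSavedPhotosAlbum once the save attempt finishes.
- (void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
    if (error) {
        NSLog(@"Error saving captured photo: %@", error);
    } else {
        NSLog(@"Captured photo saved to the photo library.");
    }
}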

Thanks for the help!