iPhone: display the camera feed in a UIView


I'm looking into how to display the front-facing camera's video feed in a UIView, similar to FaceTime. I know this is easy to do with AVCaptureVideoPreviewLayer. Is there any other way that doesn't use AVCaptureVideoPreviewLayer?

This is purely for educational purposes.

Update: I found that this can be done with UIImagePickerController:

UIImagePickerController *cameraView = [[UIImagePickerController alloc] init];
cameraView.sourceType = UIImagePickerControllerSourceTypeCamera;
cameraView.showsCameraControls = NO;   // hide the default shutter/controls overlay
[self.view addSubview:cameraView.view];
[cameraView viewWillAppear:YES];       // drive the appearance callbacks manually,
[cameraView viewDidAppear:YES];        // since the picker is embedded rather than presented
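
The same embedding can also be done through UIKit's view controller containment calls, which drive those appearance callbacks for you. A minimal sketch of that variant (the front-camera line is an assumption added here to match the FaceTime-style goal):

UIImagePickerController *cameraView = [[UIImagePickerController alloc] init];
cameraView.sourceType = UIImagePickerControllerSourceTypeCamera;
cameraView.showsCameraControls = NO;
cameraView.cameraDevice = UIImagePickerControllerCameraDeviceFront; // assumption: front camera, as in FaceTime

[self addChildViewController:cameraView];        // containment: UIKit now forwards the appearance callbacks
cameraView.view.frame = self.view.bounds;
[self.view addSubview:cameraView.view];
[cameraView didMoveToParentViewController:self];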

If you're trying to manipulate the pixels, you can put the following method in the class that you assign as the AVCaptureVideoDataOutputSampleBufferDelegate:

-(void) captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
  CVImageBufferRef pb = CMSampleBufferGetImageBuffer(sampleBuffer);

  if (CVPixelBufferLockBaseAddress(pb, 0) != kCVReturnSuccess) {  // zero is success
    NSLog(@"Error: could not lock the pixel buffer");
    return;
  }

  size_t bufferHeight = CVPixelBufferGetHeight(pb);
  size_t bufferWidth  = CVPixelBufferGetWidth(pb);
  size_t bytesPerRow  = CVPixelBufferGetBytesPerRow(pb);

  unsigned char *rowBase = (unsigned char *)CVPixelBufferGetBaseAddress(pb);

  CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
  if (colorSpace == NULL) {
    NSLog(@"Error: could not create the color space");
    CVPixelBufferUnlockBaseAddress(pb, 0);
    return;
  }

  // Create a bitmap graphics context with the sample buffer data.
  // The bitmap info below assumes the video data output delivers kCVPixelFormatType_32BGRA frames;
  // adjust it to match whatever pixel format you actually request.
  CGContextRef context = CGBitmapContextCreate(rowBase, bufferWidth, bufferHeight, 8, bytesPerRow,
                                               colorSpace,
                                               kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

  // Create a Quartz image from the pixel data in the bitmap graphics context.
  CGImageRef quartzImage = CGBitmapContextCreateImage(context);

  UIImage *currentImage = [UIImage imageWithCGImage:quartzImage];

  // Free up the image, context and color space (the UIImage retains the CGImage it was created from).
  CGImageRelease(quartzImage);
  CGContextRelease(context);
  CGColorSpaceRelease(colorSpace);

  if (CVPixelBufferUnlockBaseAddress(pb, 0) != kCVReturnSuccess) {  // zero is success
    NSLog(@"Error: could not unlock the pixel buffer");
  }

  // Hand the frame to the UI on the main thread (assumes this class has a UIImageView property).
  dispatch_async(dispatch_get_main_queue(), ^{
    self.imageView.image = currentImage;
  });
}
Then hook that image up to a UIImageView in your view controller.
Take a look at the bitmap-info flags passed to CGBitmapContextCreate (kCGImageAlphaNone, kCGImageAlphaPremultipliedFirst, byte order); the right combination depends on the pixel format you're capturing and on what you're doing with it.
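
For completeness, the delegate method above is only called once a capture session is wired to an AVCaptureVideoDataOutput. Below is a minimal sketch of that setup, assuming the 32BGRA pixel format used above; names such as startCapture and sampleQueue are illustrative, not from the original answer:

#import <AVFoundation/AVFoundation.h>

- (void)startCapture
{
    AVCaptureSession *session = [[AVCaptureSession alloc] init];

    // Pick the front camera to match the FaceTime-style use case.
    AVCaptureDevice *frontCamera = nil;
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (device.position == AVCaptureDevicePositionFront) {
            frontCamera = device;
        }
    }

    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:frontCamera error:&error];
    if (input == nil) {
        NSLog(@"Error creating camera input: %@", error);
        return;
    }
    [session addInput:input];

    // Ask for 32BGRA frames so the bitmap-context code above works as written.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    output.alwaysDiscardsLateVideoFrames = YES;

    dispatch_queue_t sampleQueue = dispatch_queue_create("camera.sample.queue", DISPATCH_QUEUE_SERIAL);
    [output setSampleBufferDelegate:self queue:sampleQueue];  // self implements captureOutput:didOutputSampleBuffer:fromConnection:
    [session addOutput:output];

    [session startRunning];
    // Keep a strong reference to the session (e.g. in a property); otherwise it is deallocated and capture stops.
}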

Why don't you want to use AVCaptureVideoPreviewLayer? I just want to evaluate all the other options. I'm not trying to manipulate pixels, just looking for an alternative. OK, great. If this works, don't forget to mark it as the answer. Thanks.