iPhone focus (autofocus) is not working in the camera (AVFoundation AVCaptureSession)
I am using the standard AVFoundation classes to capture video and display a preview. Here is my code:
- (void)setupCaptureSession {
    NSError *error = nil;

    [self setCaptureSession:[[AVCaptureSession alloc] init]];
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;

    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] &&
        [device lockForConfiguration:&error]) {
        [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        [device unlockForConfiguration];
    }

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) {
        // TODO: Handle the error when the input cannot be created
    }
    [[self captureSession] addInput:input];

    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [[self captureSession] addOutput:output];

    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    output.videoSettings =
        [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                    forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    output.minFrameDuration = CMTimeMake(1, 15);

    [[self captureSession] startRunning];

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    captureVideoPreviewLayer.frame = previewLayer.bounds;
    [previewLayer.layer insertSublayer:captureVideoPreviewLayer atIndex:0];
    [previewLayer setHidden:NO];

    mutex = YES;
}
// Delegate routine that is called when a sample buffer was written
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    if (mutex && ![device isAdjustingFocus] && ![device isAdjustingExposure] &&
        ![device isAdjustingWhiteBalance]) {
        // something
    }
}
// Create a UIImage from sample buffer data
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Get a CMSampleBuffer's Core Video image buffer for the media data
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // Create a bitmap graphics context with the sample buffer data
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8,
        bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // Create a Quartz image from the pixel data in the bitmap graphics context
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // Free up the context and color space
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    return image;
}
Everything works, but sometimes there are problems:

- Camera focus does not work. It is random: sometimes it works, sometimes it does not. I have tried it on different devices, both iPhone 4 and 3GS. I tried googling it, but found nothing; people only mention damaged hardware, but I checked three iPhone 4s and one iPhone 3GS, and the problem shows up on all of them.
- The camera takes a long time to load. I am using the ScannerKit API, which uses the camera for the same purpose, and it loads roughly twice as fast as my implementation.
Do you have any idea what the problem might be? The first issue is definitely the more important one.

I have noticed that video presets take much longer to initialize than photo presets. Are you recording video or taking photos? I see that you use a medium-quality preset together with 32BGRA; it might be better to set the capture mode to Photo and downsample the image after capture. Also set AVVideoCodecJPEG instead of 32BGRA:
[stillImageOutput setOutputSettings:[NSDictionary dictionaryWithObject:AVVideoCodecJPEG forKey:AVVideoCodecKey]];

instead of:

[output setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
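A minimal sketch of the Photo-preset variant suggested above (this is not from the original thread; the `stillImageOutput` name is mine, and `outputSettings` is a property of AVCaptureStillImageOutput, not of the device):

```objc
// Use the Photo preset and a still image output that delivers JPEG data,
// instead of a 32BGRA video data output.
self.captureSession.sessionPreset = AVCaptureSessionPresetPhoto;

AVCaptureStillImageOutput *stillImageOutput =
    [[[AVCaptureStillImageOutput alloc] init] autorelease];
stillImageOutput.outputSettings =
    [NSDictionary dictionaryWithObject:AVVideoCodecJPEG
                                forKey:AVVideoCodecKey];
if ([self.captureSession canAddOutput:stillImageOutput])
    [self.captureSession addOutput:stillImageOutput];
```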
You may also want to register for subject-area-change notifications to monitor when the subject area changes, and force a refocus at that point by switching the focus mode to AVCaptureFocusModeAutoFocus.

You may also need to add code that manually triggers autofocus and then resets it back to automatic, since this is sometimes required.
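A sketch of the notification approach described above (not from the original thread; it assumes the same `device` ivar as the question's code, and subject-area-change monitoring requires iOS 5 or later):

```objc
// Enable subject-area-change monitoring (must be done while the device is
// locked for configuration) and observe the corresponding notification.
- (void)enableSubjectAreaMonitoring {
    NSError *error = nil;
    if ([device lockForConfiguration:&error]) {
        device.subjectAreaChangeMonitoringEnabled = YES;
        [device unlockForConfiguration];
        [[NSNotificationCenter defaultCenter]
            addObserver:self
               selector:@selector(subjectAreaDidChange:)
                   name:AVCaptureDeviceSubjectAreaDidChangeNotification
                 object:device];
    }
}

// Force a one-shot refocus whenever the subject area changes; you can switch
// back to continuous autofocus afterwards if that is your normal mode.
- (void)subjectAreaDidChange:(NSNotification *)notification {
    NSError *error = nil;
    if ([device isFocusModeSupported:AVCaptureFocusModeAutoFocus] &&
        [device lockForConfiguration:&error]) {
        [device setFocusMode:AVCaptureFocusModeAutoFocus];
        [device unlockForConfiguration];
    }
}
```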
I have modified the code to set a focus point of interest and to report camera configuration errors through a delegate method:
- (void)setupCaptureSession {
    NSError *error = nil;

    [self setCaptureSession:[[AVCaptureSession alloc] init]];
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;

    device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if ([device isFocusModeSupported:AVCaptureFocusModeContinuousAutoFocus] &&
        [device lockForConfiguration:&error]) {
        [device setFocusMode:AVCaptureFocusModeContinuousAutoFocus];
        if ([device isFocusPointOfInterestSupported])
            [device setFocusPointOfInterest:CGPointMake(0.5f, 0.5f)];
        [device unlockForConfiguration];
    } else {
        if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)]) {
            [[self delegate] captureManager:self didFailWithError:error];
        }
    }

    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                        error:&error];
    if (!input) {
        // TODO: Handle the error when the input cannot be created
    }
    [[self captureSession] addInput:input];

    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    [[self captureSession] addOutput:output];

    dispatch_queue_t queue = dispatch_queue_create("myQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);

    output.videoSettings =
        [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                    forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    output.minFrameDuration = CMTimeMake(1, 15);

    [[self captureSession] startRunning];

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    captureVideoPreviewLayer.frame = previewLayer.bounds;
    [previewLayer.layer insertSublayer:captureVideoPreviewLayer atIndex:0];
    [previewLayer setHidden:NO];

    mutex = YES;
}
This is an old question, but it may still save someone a few hours of frustration. It is important to set the point of interest *before* calling setFocusMode, otherwise your camera will run the focus at the previous focus point. Think of setFocusMode as a COMMIT. The same applies to setExposureMode.
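A sketch of that ordering (my own illustration, not code from the thread), assuming the same `device` ivar and a `point` in the (0,0)-(1,1) point-of-interest coordinate space, e.g. CGPointMake(0.5f, 0.5f) for the center:

```objc
NSError *error = nil;
if ([device lockForConfiguration:&error]) {
    // 1. Choose where to focus and meter ...
    if ([device isFocusPointOfInterestSupported])
        [device setFocusPointOfInterest:point];
    // 2. ... then "commit" with the mode setter.
    if ([device isFocusModeSupported:AVCaptureFocusModeAutoFocus])
        [device setFocusMode:AVCaptureFocusModeAutoFocus];

    // Same pattern for exposure: point first, mode second.
    if ([device isExposurePointOfInterestSupported])
        [device setExposurePointOfInterest:point];
    if ([device isExposureModeSupported:AVCaptureExposureModeAutoExpose])
        [device setExposureMode:AVCaptureExposureModeAutoExpose];

    [device unlockForConfiguration];
}
```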
Apple's AVCam sample gets this completely wrong and is broken.

Try setting the focus mode after creating the video input. That is the only difference between your code and mine. You may also want to check, at some later point in your code, whether the device is still in the mode you set.

Still the same; any other ideas?

I played around with my app and reproduced your problem. I have not had a chance to debug it yet. As a first step, I would try to reproduce it with the AVCamDemo from WWDC 2010.

Very true and helpful :-) Thanks, saved a lot of time.