iOS 5 - AVCaptureDevice: setting focus point and focus mode freezes the live camera preview
Since iOS 4 I have been setting the focus with the following method:
- (void)focusAtPoint:(CGPoint)point
{
    AVCaptureDevice *device = [[self captureInput] device];
    NSError *error;
    if ([device isFocusModeSupported:AVCaptureFocusModeAutoFocus] &&
        [device isFocusPointOfInterestSupported])
    {
        if ([device lockForConfiguration:&error]) {
            [device setFocusPointOfInterest:point];
            [device setFocusMode:AVCaptureFocusModeAutoFocus];
            [device unlockForConfiguration];
        } else {
            NSLog(@"Error: %@", error);
        }
    }
}
On iOS 4 devices this works without any problem. But on iOS 5 the live camera preview freezes and, after a few seconds, goes completely black. No exception or error is raised.
If I comment out either setFocusPointOfInterest or setFocusMode, the problem does not occur, so it is the combination of the two that triggers this behavior.

The point you are passing to setFocusPointOfInterest: is invalid, which is why it crashes. Add the following method to your program and use the value it returns:
- (CGPoint)convertToPointOfInterestFromViewCoordinates:(CGPoint)viewCoordinates
{
    CGPoint pointOfInterest = CGPointMake(.5f, .5f);
    CGSize frameSize = [[self videoPreviewView] frame].size;
    AVCaptureVideoPreviewLayer *videoPreviewLayer = [self prevLayer];

    if ([videoPreviewLayer isMirrored]) {
        viewCoordinates.x = frameSize.width - viewCoordinates.x;
    }

    if ([[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResize]) {
        pointOfInterest = CGPointMake(viewCoordinates.y / frameSize.height,
                                      1.f - (viewCoordinates.x / frameSize.width));
    } else {
        CGRect cleanAperture;
        for (AVCaptureInputPort *port in [[[[self captureSession] inputs] lastObject] ports]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                cleanAperture = CMVideoFormatDescriptionGetCleanAperture([port formatDescription], YES);
                CGSize apertureSize = cleanAperture.size;
                CGPoint point = viewCoordinates;

                CGFloat apertureRatio = apertureSize.height / apertureSize.width;
                CGFloat viewRatio = frameSize.width / frameSize.height;
                CGFloat xc = .5f;
                CGFloat yc = .5f;

                if ([[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspect]) {
                    if (viewRatio > apertureRatio) {
                        // Black bars on the left and right of the preview
                        CGFloat y2 = frameSize.height;
                        CGFloat x2 = frameSize.height * apertureRatio;
                        CGFloat x1 = frameSize.width;
                        CGFloat blackBar = (x1 - x2) / 2;
                        if (point.x >= blackBar && point.x <= blackBar + x2) {
                            xc = point.y / y2;
                            yc = 1.f - ((point.x - blackBar) / x2);
                        }
                    } else {
                        // Black bars above and below the preview
                        CGFloat y2 = frameSize.width / apertureRatio;
                        CGFloat y1 = frameSize.height;
                        CGFloat x2 = frameSize.width;
                        CGFloat blackBar = (y1 - y2) / 2;
                        if (point.y >= blackBar && point.y <= blackBar + y2) {
                            xc = ((point.y - blackBar) / y2);
                            yc = 1.f - (point.x / x2);
                        }
                    }
                } else if ([[videoPreviewLayer videoGravity] isEqualToString:AVLayerVideoGravityResizeAspectFill]) {
                    if (viewRatio > apertureRatio) {
                        CGFloat y2 = apertureSize.width * (frameSize.width / apertureSize.height);
                        xc = (point.y + ((y2 - frameSize.height) / 2.f)) / y2;
                        yc = (frameSize.width - point.x) / frameSize.width;
                    } else {
                        CGFloat x2 = apertureSize.height * (frameSize.height / apertureSize.width);
                        yc = 1.f - ((point.x + ((x2 - frameSize.width) / 2)) / x2);
                        xc = point.y / frameSize.height;
                    }
                }
                pointOfInterest = CGPointMake(xc, yc);
                break;
            }
        }
    }
    return pointOfInterest;
}
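As a sanity check on the geometry, the AVLayerVideoGravityResizeAspect branch of the method above can be transliterated into a short, testable sketch (the function name and Python form are illustrative only, not part of the original Objective-C):

```python
def point_of_interest_aspect_fit(tap, frame_size, aperture_size):
    """Map a tap in view coordinates to a normalized focus point for a
    preview layer using AVLayerVideoGravityResizeAspect (letter/pillarboxed).
    Mirrors the aspect-fit branch of the answer's Objective-C code."""
    x, y = tap
    fw, fh = frame_size          # preview view size, in points
    aw, ah = aperture_size       # clean-aperture size of the video feed
    aperture_ratio = ah / aw
    view_ratio = fw / fh
    xc, yc = 0.5, 0.5            # default: center of the picture area
    if view_ratio > aperture_ratio:
        # Black bars on the left and right of the preview
        y2 = fh
        x2 = fh * aperture_ratio
        black_bar = (fw - x2) / 2
        if black_bar <= x <= black_bar + x2:
            xc = y / y2
            yc = 1.0 - (x - black_bar) / x2
    else:
        # Black bars above and below the preview
        y2 = fw / aperture_ratio
        black_bar = (fh - y2) / 2
        if black_bar <= y <= black_bar + y2:
            xc = (y - black_bar) / y2
            yc = 1.0 - x / fw
    return (xc, yc)

# A tap at the center of a 320x480 portrait preview of a 1280x720 feed
# lands at the center of the picture area:
print(point_of_interest_aspect_fit((160, 240), (320, 480), (1280, 720)))
# → (0.5, 0.5)
```

A tap inside a black bar leaves the default (0.5, 0.5), matching the Objective-C version; the result is always in the {0,0}-{1,1} range that setFocusPointOfInterest: requires.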
I'd like to add some supplementary information to @Louis's answer.

According to the documentation (note the bolded part):

In addition, a device may support a focus point of interest. You test for support with focusPointOfInterestSupported. If it's supported, you set the focal point using focusPointOfInterest. You pass a CGPoint where {0,0} represents the top left of the picture area and {1,1} represents the bottom right in landscape mode with the home button on the right - this applies even if the device is in portrait mode.
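For the simplest case (AVLayerVideoGravityResize, no letterboxing), this coordinate convention amounts to a 90° rotation of the normalized tap position; a minimal sketch, with a hypothetical helper name:

```python
def resize_gravity_focus_point(tap, frame_size):
    """For AVLayerVideoGravityResize, a tap at view point (x, y) in a
    portrait preview maps to the landscape-referenced focus point
    (y / h, 1 - x / w), because {0,0} is the top-left and {1,1} the
    bottom-right of the sensor image in landscape orientation."""
    x, y = tap
    w, h = frame_size
    return (y / h, 1.0 - x / w)

# A tap at the center of the view maps to the center of the picture area:
print(resize_gravity_focus_point((160, 240), (320, 480)))
# → (0.5, 0.5)
```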
So we should take the device orientation into account when calculating focusPointOfInterest.

I used exactly the same code on iOS 5 without any problems - could your bug be somewhere else?

To repeat: I haven't tested this function, but I have implemented several similar ones myself. It looks like you've done extra work that I didn't, accounting for the various video gravities and so on. Well done! This code comes from Apple's AVCam demo.

Doesn't Apple's newest API let you do all of this in a single line?

The question says iOS 5, but since this is a question I stumbled upon, I thought I'd leave this reference here: the rules for this have changed since iOS 6.