iOS: I want to throttle the video capture frame rate in the AVCapture framework


I'm trying to throttle my application's video capture frame rate because I've found that it affects VoiceOver performance.

Currently it grabs frames from the camera and then processes them as quickly as possible using OpenGL routines. I would like to set a specific frame rate during capture.

I was hoping to do this by using videoMinFrameDuration or minFrameDuration, but that seems to have no effect on performance. Any ideas?

NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
for (AVCaptureDevice *device in devices)
{
    if ([device position] == AVCaptureDevicePositionBack)
    {
        backFacingCamera = device;
        // SET SOME OTHER PROPERTIES
    }
}


// Create the capture session
captureSession = [[AVCaptureSession alloc] init];

// Add the video input  
NSError *error = nil;
videoInput = [[[AVCaptureDeviceInput alloc] initWithDevice:backFacingCamera error:&error] autorelease];

// Add the video frame output   
videoOutput = [[AVCaptureVideoDataOutput alloc] init];
[videoOutput setAlwaysDiscardsLateVideoFrames:YES];

[videoOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(id)kCVPixelBufferPixelFormatTypeKey]];
[videoOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];



// Start capturing
if([backFacingCamera supportsAVCaptureSessionPreset:AVCaptureSessionPreset1920x1080])
{
    [captureSession setSessionPreset:AVCaptureSessionPreset1920x1080]; 
    captureDeviceWidth = 1920; 
    captureDeviceHeight = 1080;
    #if defined(VA_DEBUG)
    NSLog(@"Video AVCaptureSessionPreset1920x1080");
    #endif
}
else
{
    // Fall back to a lower-resolution session preset here
}

// If you wish to cap the frame rate to a known value, set minFrameDuration.
// CMTimeMake(1, 2) below caps it at 2 fps.
AVCaptureConnection *conn = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
if (conn.supportsVideoMinFrameDuration)
    conn.videoMinFrameDuration = CMTimeMake(1,2);
else
    videoOutput.minFrameDuration = CMTimeMake(1,2);


if ([captureSession canAddInput:videoInput]) 
    [captureSession addInput:videoInput];


if ([captureSession canAddOutput:videoOutput])
    [captureSession addOutput:videoOutput];

if (![captureSession isRunning])
    [captureSession startRunning];
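
For context, the captured frames are then delivered through the sample buffer delegate registered above. A minimal sketch of that callback (the OpenGL processing itself is assumed to happen inside it and is not shown):

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Each captured frame arrives here as a CMSampleBuffer.
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // ... hand pixelBuffer off to the OpenGL processing routines ...
}
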
Any thoughts? Am I missing something? Is this the best way to throttle the capture frame rate?

AVCaptureConnection *conn = [videoOutput connectionWithMediaType:AVMediaTypeVideo];
if (conn.supportsVideoMinFrameDuration)
    conn.videoMinFrameDuration = CMTimeMake(1,2);
else
    videoOutput.minFrameDuration = CMTimeMake(1,2);

It turns out you need to set both videoMinFrameDuration and videoMaxFrameDuration for either one to work.

For example:
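
A minimal sketch, mirroring the question's code and keeping the same 2 fps cap, would set both durations on the connection:

AVCaptureConnection *conn = [videoOutput connectionWithMediaType:AVMediaTypeVideo];

// Set BOTH the minimum and maximum frame duration; setting only one has no effect.
if (conn.supportsVideoMinFrameDuration)
    conn.videoMinFrameDuration = CMTimeMake(1, 2);
if (conn.supportsVideoMaxFrameDuration)
    conn.videoMaxFrameDuration = CMTimeMake(1, 2);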


Mike Ullrich's answer worked up until iOS 7. Unfortunately, both of those methods are deprecated in iOS 7; you have to set activeVideo{Min|Max}FrameDuration on the AVCaptureDevice itself. Something like:

int fps                             = 30;  // Change this value
AVCaptureDevice *device             = ...; // Get the active capture device
[device lockForConfiguration:nil];
[device setActiveVideoMinFrameDuration:CMTimeMake(1, fps)];
[device setActiveVideoMaxFrameDuration:CMTimeMake(1, fps)];
[device unlockForConfiguration];
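
One caveat: the requested rate has to fall within the active format's supported frame rate ranges. A minimal sketch of checking this before locking the device:

// Verify the desired rate is supported by the device's current activeFormat
BOOL rateSupported = NO;
for (AVFrameRateRange *range in device.activeFormat.videoSupportedFrameRateRanges)
{
    if (fps >= range.minFrameRate && fps <= range.maxFrameRate)
    {
        rateSupported = YES;
        break;
    }
}

NSError *error = nil;
if (rateSupported && [device lockForConfiguration:&error])
{
    device.activeVideoMinFrameDuration = CMTimeMake(1, fps);
    device.activeVideoMaxFrameDuration = CMTimeMake(1, fps);
    [device unlockForConfiguration];
}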

I wasn't clear in my original message: the minFrameDuration approach doesn't seem to do anything, and it isn't reducing the frame rate. I'm building with the iOS 5.0 SDK, targeting iOS 4.3.