iOS: Delay when executing a dispatch_async block in an AVCaptureVideoDataOutputSampleBufferDelegate method

Tags: ios, avfoundation, grand-central-dispatch

I am currently working on a project that involves blink detection using AVCaptureVideoDataOutputSampleBufferDelegate. I have the following dispatch_async block in the delegate method:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {

    // Initialisation of buffer and UIImage and CIDetector, etc.

    dispatch_async(dispatch_get_main_queue(), ^(void) {
        if (features.count > 0) {
            CIFaceFeature *feature = [features objectAtIndex:0];
            if ([feature leftEyeClosed] && [feature rightEyeClosed]) {
                flag = TRUE;
            } else {
                if (flag) {
                    blinkcount++;
                    // Update UILabel containing blink count. The count variable is incremented from here.
                }
                flag = FALSE;
            }
        }
    });
}
The method shown above is called continuously and processes the video feed from the camera. The flag boolean tracks whether the eyes were closed or open in the previous frame, so that a blink can be detected. A significant number of frames are dropped, but blinks are still detected correctly, so I think the frame rate being processed is sufficient.

My problem is that the UILabel gets updated only after a considerable delay (~1 second) after a blink is performed. This makes the app feel laggy and unintuitive. I tried writing the UI update code without the dispatch, but that is not possible. Is there anything I can do so that the UILabel is updated immediately after a blink is performed?

It is hard to know exactly what is going on here without more code, but above the dispatch code you say:

//Initialisation of buffer and UIImage and CIDetector, etc.
If you are really initializing the detector on every frame, that is likely suboptimal -- make it long-lived. I am not sure how expensive initializing a CIDetector is, but it is a place to start. If you are really going through a UIImage here, that is also suboptimal. Skip the UIImage and take the more direct route:

CVImageBufferRef ib = CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage* ciImage = [CIImage imageWithCVPixelBuffer: ib];
NSArray* features = [longLivedDetector featuresInImage: ciImage];
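As a side note, here is a sketch of how the long-lived detector might be created lazily (the ivar name _longLivedDetector is a placeholder). One detail not shown in the question's code: CIDetector only populates leftEyeClosed/rightEyeClosed when the CIDetectorEyeBlink option is passed at detection time.

```objective-c
// Sketch: create the detector once (lazily, in an ivar) rather than per frame.
// Assumes an ivar declared elsewhere: CIDetector *_longLivedDetector;
if (!_longLivedDetector) {
    _longLivedDetector = [CIDetector detectorOfType:CIDetectorTypeFace
                                            context:nil
                                            options:@{CIDetectorAccuracy: CIDetectorAccuracyLow}];
}

CVImageBufferRef ib = CMSampleBufferGetImageBuffer(sampleBuffer);
CIImage *ciImage = [CIImage imageWithCVPixelBuffer:ib];

// leftEyeClosed/rightEyeClosed are only reported when CIDetectorEyeBlink is requested.
NSArray *features = [_longLivedDetector featuresInImage:ciImage
                                                options:@{CIDetectorEyeBlink: @YES}];
```

CIDetectorAccuracyLow trades detection quality for speed, which tends to matter when running against a live video feed.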
Finally, do the feature detection on a background thread, and marshal only the UILabel update back to the main thread, like this:

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (!_longLivedDetector) {
        _longLivedDetector = [CIDetector detectorOfType:CIDetectorTypeFace context: ciContext options: whatever];
    }

    CVImageBufferRef ib = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage* ciImage = [CIImage imageWithCVPixelBuffer: ib];
    NSArray* features = [_longLivedDetector featuresInImage: ciImage];
    if (!features.count)
        return;

    CIFaceFeature *feature = [features objectAtIndex:0];
    const BOOL leftAndRightClosed = [feature leftEyeClosed] && [feature rightEyeClosed];

    // Only trivial work is left to do on the main thread.
    dispatch_async(dispatch_get_main_queue(), ^(void){
        if (leftAndRightClosed) {
            flag = TRUE;
        } else {
            if (flag) {
                blinkcount++;
                //Update UILabel containing blink count. The count variable is incremented from here.
            }
            flag = FALSE;
        }
    });
}
Finally, you should also keep in mind that face feature detection is a non-trivial signal-processing task that takes real computation (i.e., time) to complete. I would not expect there is a way to make it much faster short of faster hardware.
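For the pattern above to work, the delegate callback itself has to arrive off the main thread. A sketch of the capture setup that makes that happen (the queue label and variable names are placeholders, not from the original answer): captureOutput:didOutputSampleBuffer:fromConnection: is delivered on whatever queue was passed to setSampleBufferDelegate:queue:, and letting AVFoundation discard late frames prevents a backlog from building up behind slow detection.

```objective-c
// Sketch: deliver sample buffers on a serial background queue so that
// feature detection never runs on the main thread.
AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
videoOutput.alwaysDiscardsLateVideoFrames = YES; // drop frames rather than queue them up

dispatch_queue_t detectionQueue =
    dispatch_queue_create("com.example.blink.detection", DISPATCH_QUEUE_SERIAL);
[videoOutput setSampleBufferDelegate:self queue:detectionQueue];
```

With this in place, everything in the delegate method up to the dispatch_async runs on detectionQueue, and only the trivial label update touches the main queue.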