Unable to draw a CIImage on a GLKView after a few frames since updating to iOS 10.2?


Using the code below, my app had been silently working fine, drawing a CIImage (received from AVCaptureOutput's -didOutputSampleBuffer) onto a GLKView frame after frame, until I updated to iOS 10.2.

Answer: Your GLKView rendering implementation looks fine; the problem seems to come from the amount of processing you do on the pixel buffer after converting it to a CIImage.
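One way to sanity-check that claim (an illustrative sketch, not from the question; pixelBuffer and sampleBuffer stand in for the capture callback's arguments) is to time the per-frame filter work:

// Inside -captureOutput:didOutputSampleBuffer:fromConnection:.
// At 30 fps a frame has ~33 ms of budget; at 24 fps, ~41 ms.
// Anything consistently slower falls behind the capture rate.
CFAbsoluteTime t0 = CFAbsoluteTimeGetCurrent();
CIImage *filtered = [self ciImageFromPixelBuffer:pixelBuffer ofSampleBuffer:sampleBuffer];
NSLog(@"frame processing took %.1f ms", (CFAbsoluteTimeGetCurrent() - t0) * 1000.0);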


Also, the Imgur link you shared shows that the GLKView fails to prepare its VideoTexture object correctly, most likely because of the memory load created on each iteration. You need to optimize this CIFilter processing.
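A related sketch (the property names mirror the question's code; the options dictionary is an assumption, not taken from the question): create the EAGLContext-backed CIContext once at setup and reuse it every frame, rather than allocating GPU state per iteration:

// One-time setup; reusing a single CIContext avoids per-frame allocations.
self.eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
_glkView.context = self.eaglContext;
// Passing NSNull as the working color space disables color management,
// trading some color fidelity for less per-frame work.
_ciContext = [CIContext contextWithEAGLContext:self.eaglContext
                                       options:@{kCIContextWorkingColorSpace : [NSNull null]}];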

Comments on the answer:

Answerer: What happens if you bypass the CIFilter-related processing?

Asker: OK, I passed the pixel buffer to the GLKView as-is; it no longer crashes, but of course no effect is applied either. But why is that? Does it mean I can never use certain filters? I looked at CIFunhouse, which implements almost every Core Image filter, but... I'm confused.

Answerer: Check the resolution of your pixel buffer against what CIFunhouse uses. You can't apply this much extra work on top of each CIFilter at 24 or 30 frames per second, so it won't work for higher-resolution video, especially on iOS devices from a year or two ago.

Asker: How can I keep all the filters working? Should I scale down, or use a lower preset?
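On the "lower preset" idea, a sketch assuming a standard AVCaptureSession setup (the question does not show its capture configuration): a lower session preset shrinks every frame before Core Image ever touches it.

// Request 720p frames instead of the device's maximum resolution.
AVCaptureSession *session = [[AVCaptureSession alloc] init];
if ([session canSetSessionPreset:AVCaptureSessionPreset1280x720]) {
    session.sessionPreset = AVCaptureSessionPreset1280x720;
}

For reference, the rendering code from the question: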
[_glkView bindDrawable];

if (self.eaglContext != [EAGLContext currentContext])
    [EAGLContext setCurrentContext:self.eaglContext];

// Clear to opaque black before drawing each frame.
glClearColor(0.0, 0.0, 0.0, 1.0);
glClear(GL_COLOR_BUFFER_BIT);

// Premultiplied-alpha blending for Core Image output.
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);

if (ciImage) {
    [_ciContext drawImage:ciImage inRect:gvRect fromRect:dRect];
}

[_glkView display];
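gvRect and dRect are not defined in the snippet above; as an assumption, a typical pairing draws the image's full extent into the view's drawable, which is sized in pixels rather than points:

// Assumed rect setup for -drawImage:inRect:fromRect: (not in the question).
CGRect dRect  = ciImage.extent;  // source rect, in image space
CGRect gvRect = CGRectMake(0, 0, _glkView.drawableWidth, _glkView.drawableHeight);  // destination, in pixels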
- (CIImage *)ciImageFromPixelBuffer:(CVPixelBufferRef)pixelBuffer ofSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    CIImage *croppedImage = nil;

    // Propagate the sample buffer's attachments (color space etc.) to the CIImage.
    CFDictionaryRef attachments = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:pixelBuffer options:(__bridge NSDictionary *)attachments];

    if (attachments)
        CFRelease(attachments);

    croppedImage = ciImage;

    // Downscale first so every later filter works on fewer pixels.
    CIFilter *scaleFilter = [CIFilter filterWithName:@"CILanczosScaleTransform"];
    [scaleFilter setValue:croppedImage forKey:kCIInputImageKey];
    [scaleFilter setValue:@(self.zoom_Resize_Factor == 1 ? 0.25f : 0.5f) forKey:kCIInputScaleKey];
    [scaleFilter setValue:@1.0f forKey:kCIInputAspectRatioKey];
    croppedImage = scaleFilter.outputImage;

    // Chain Core Image's auto-adjustment filters (red-eye disabled).
    // Note: the adjustments are derived from the full-resolution ciImage
    // but applied to the downscaled croppedImage.
    NSDictionary *options = @{(id)kCIImageAutoAdjustRedEye : @(NO)};

    NSArray *adjustments = [ciImage autoAdjustmentFiltersWithOptions:options];
    for (CIFilter *filter in adjustments) {
        [filter setValue:croppedImage forKey:kCIInputImageKey];
        croppedImage = filter.outputImage;
    }

    CIFilter *selectedFilter = [VideoFilterFactory getFilterWithType:self.selectedFilterType]; // This line needs to be removed from here (see the sketch after this method)

    croppedImage = [VideoFilterFactory applyFilter:selectedFilter OnImage:croppedImage];

    // The matching CVPixelBufferLockBaseAddress presumably happens in the caller.
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    return croppedImage;
}
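Following the note in the code above, a sketch (VideoFilterType and the selectedFilter property are assumed names; VideoFilterFactory comes from the question) of hoisting the filter lookup out of the per-frame path so +getFilterWithType: runs only when the selection changes:

// Resolve the CIFilter once, when the user picks it...
- (void)setSelectedFilterType:(VideoFilterType)selectedFilterType {
    _selectedFilterType = selectedFilterType;
    self.selectedFilter = [VideoFilterFactory getFilterWithType:selectedFilterType];
}

// ...then reuse it inside -ciImageFromPixelBuffer:ofSampleBuffer::
// croppedImage = [VideoFilterFactory applyFilter:self.selectedFilter OnImage:croppedImage];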