Objective-C: CIDetector isn't releasing memory
I'm using a CIDetector multiple times, as follows:
-(NSArray *)detect:(UIImage *)inimage
{
    UIImage *inputimage = inimage;
    UIImageOrientation exifOrientation = inimage.imageOrientation;
    NSNumber *orientation = [NSNumber numberWithInt:exifOrientation];
    NSDictionary *imageOptions = [NSDictionary dictionaryWithObject:orientation forKey:CIDetectorImageOrientation];
    CIImage *ciimage = [CIImage imageWithCGImage:inputimage.CGImage options:imageOptions];

    NSDictionary *detectorOptions = [NSDictionary dictionaryWithObject:orientation forKey:CIDetectorImageOrientation];
    NSArray *features = [self.detector featuresInImage:ciimage options:detectorOptions];
    if (features.count == 0)
    {
        PXLog(@"no face found");
    }
    ciimage = nil;

    NSMutableArray *returnArray = [NSMutableArray new];
    for (CIFaceFeature *feature in features)
    {
        CGRect rect = feature.bounds;
        // Flip from Core Image's bottom-left origin to UIKit's top-left origin.
        CGRect r = CGRectMake(rect.origin.x,
                              inputimage.size.height - rect.origin.y - rect.size.height,
                              rect.size.width,
                              rect.size.height);
        FaceFeatures *ff = [[FaceFeatures alloc] initWithLeftEye:CGPointMake(feature.leftEyePosition.x, inputimage.size.height - feature.leftEyePosition.y)
                                                        rightEye:CGPointMake(feature.rightEyePosition.x, inputimage.size.height - feature.rightEyePosition.y)
                                                           mouth:CGPointMake(feature.mouthPosition.x, inputimage.size.height - feature.mouthPosition.y)];
        Face *ob = [[Face alloc] initFaceInRect:r withFaceFeatures:ff];
        [returnArray addObject:ob];
    }
    features = nil;
    return returnArray;
}
-(CIContext *)context
{
    if (!_context) {
        _context = [CIContext contextWithOptions:nil];
    }
    return _context;
}
-(CIDetector *)detector
{
    if (!_detector)
    {
        // 1 for high, 0 for low
#warning not checking for fast/slow detection operation
        NSString *str = @"fast"; //[SettingsFunctions retrieveFromUserDefaults:@"face_detection_accuracy"];
        if ([str isEqualToString:@"slow"])
        {
            //DDLogInfo(@"faceDetection: -I- Setting accuracy to high");
            _detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                           context:nil
                                           options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyHigh forKey:CIDetectorAccuracy]];
        } else {
            //DDLogInfo(@"faceDetection: -I- Setting accuracy to low");
            _detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                           context:nil
                                           options:[NSDictionary dictionaryWithObject:CIDetectorAccuracyLow forKey:CIDetectorAccuracy]];
        }
    }
    return _detector;
}
But after running into various memory problems, and according to Instruments, it looks like the NSArray returned by [self.detector featuresInImage:ciimage options:detectorOptions] is never being released.
Is there a memory leak in my code?
I ran into the same problem, and it seems to be a bug with reusing a CIDetector (or perhaps it is by design, for caching purposes).
I got around it by not reusing the CIDetector, instead instantiating one as needed and then releasing it (or, in ARC terms, simply not keeping a reference to it) when the detection is done. There is some cost to this, but if you are doing the detection on a background thread as you said, that cost is probably worth it compared with unbounded memory growth.
Perhaps a better solution, if you are detecting multiple images in succession, would be to create one detector, use it for all of the images, and then release it and create a new one every N images if the growth becomes too large. (You'll have to experiment to find out what N should be.)
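The per-call approach described above can be sketched as follows (a minimal sketch assuming ARC; the method name detectOnce: is made up for illustration, and the detector is a local variable instead of a cached property so nothing keeps it alive after the call):

```objc
// Sketch: create the CIDetector per call instead of caching it in a property.
// With no strong reference kept, ARC can release the detector (and whatever
// it caches internally) once the method returns and the pool drains.
- (NSArray *)detectOnce:(UIImage *)inimage
{
    NSArray *features = nil;
    @autoreleasepool {
        CIImage *ciimage = [CIImage imageWithCGImage:inimage.CGImage];
        CIDetector *detector =
            [CIDetector detectorOfType:CIDetectorTypeFace
                               context:nil
                               options:@{ CIDetectorAccuracy : CIDetectorAccuracyLow }];
        features = [detector featuresInImage:ciimage];
    }
    return features;
}
```

For the "every N images" variant, you would keep the detector in a property as before, but also keep a counter and set the property back to nil every N calls so that the next access through the lazy getter recreates it.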
I have filed a radar bug with Apple about this issue.
I solved this problem: you should use @autoreleasepool wherever the detection method is involved, like this in Swift:
autoreleasepool(invoking: {
    let result = self.detect(image: image)
    // do other things
})
Are you calling detect() from a background thread, e.g. from captureOutput() or similar? Try surrounding the detector with an @autoreleasepool block. We had problems with memory not being autoreleased on non-UI threads, and this solved a lot of them for us.
Does this also work with ARC? Which part did you wrap in the @autoreleasepool block?
After playing with this for a few days, this is probably the best and only option until Apple fixes it (smh), but I want to point out one thing: the CIDetector has to be created inside the detect function (Apple says the best practice is to create one instance and reuse it everywhere — bulls***), and async calls work too. In my case, without the autoreleasepool, memory went from 50 MB to 140 MB and stayed there, while with the autoreleasepool it went from 50 MB to 75-79 MB, no matter how many images I processed.
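For the Objective-C side of the question, the same @autoreleasepool wrapping would look roughly like this (a sketch: the dispatch queue and the image variable are assumptions, and -detect: is the method from the question):

```objc
// Drain autoreleased Core Image objects after each detection instead of
// letting them accumulate on a long-lived background thread.
dispatch_async(dispatch_get_global_queue(QOS_CLASS_USER_INITIATED, 0), ^{
    @autoreleasepool {
        NSArray *faces = [self detect:image];
        // ... use faces ...
    }
});
```

Note that @autoreleasepool blocks work fine under ARC: ARC only forbids sending autorelease yourself, not using pool blocks, and draining the pool eagerly is exactly what helps here.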