iOS camera app crashes at CGContextDrawImage


I am trying to capture images using AVCaptureSession; I followed this tutorial. I create a UIImage from the image ref, and then read the pixels out of that UIImage. But the app crashes after a short time (less than 30 seconds). I tried profiling it with Leaks, and it crashed there too. Using logs, I found that the app crashes just before the line CGContextDrawImage(context, rect, image1.CGImage);. Do you have any suggestions about what I might be doing wrong? I also see memory-allocation errors a few seconds before the crash. Please help.

The code is posted below.

// Create a UIImage from sample buffer data

- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    lock = @"YES";

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Lock the base address of the pixel buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // Get the base address of the pixel buffer
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);

    // Get the number of bytes per row for the pixel buffer
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

    // Get the pixel buffer width and height
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Create a device-dependent RGB color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    size_t bufferSize = CVPixelBufferGetDataSize(imageBuffer);

    // Create a Quartz direct-access data provider that uses data we supply.
    NSData *data = [NSData dataWithBytes:baseAddress length:bufferSize];

    CGDataProviderRef dataProvider = CGDataProviderCreateWithCFData((__bridge CFDataRef)data);

    CGImageRef quartzImage = CGImageCreate(width, height, 8, 32, bytesPerRow,
                  colorSpace, kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little,
                  dataProvider, NULL, true, kCGRenderingIntentDefault);

    CGDataProviderRelease(dataProvider);
    CGColorSpaceRelease(colorSpace);

    // Create an image object from the Quartz image
    UIImage *image = [UIImage imageWithCGImage:quartzImage];

    // Release the Quartz image
    CGImageRelease(quartzImage);

    // Unlock the pixel buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    baseAddress = nil;
    [data release];
    lock = @"NO";
    return image;
}

- (void)calculate
{
    @try {
        UIImage *image1 = [self stillImage];   // Capture an image from the camera.
        // Extract the pixels from the camera image.

        CGColorSpaceRef colourSpace = CGColorSpaceCreateDeviceRGB();

        size_t bytesPerRow = image1.size.width * 4;
        unsigned char *bitmapData = (unsigned char *)malloc(bytesPerRow * image1.size.height);

        CGContextRef context = CGBitmapContextCreate(bitmapData, image1.size.width, image1.size.height, 8, bytesPerRow, colourSpace, kCGImageAlphaPremultipliedFirst | kCGBitmapByteOrder32Big);

        CGColorSpaceRelease(colourSpace);

        CGContextDrawImage(context, rect, image1.CGImage);

        unsigned char *pixels = (unsigned char *)CGBitmapContextGetData(context);

        totalLuminance = 0.0;
        for (int p = 0; p < image1.size.width * image1.size.height * 4; p += 4)
        {
            totalLuminance += pixels[p] * 0.3 + pixels[p+1] * 0.59 + pixels[p+2] * 0.11;
        }

        totalLuminance /= (image1.size.height * image1.size.width);

        pixels = nil;
        bitmapData = nil;

        [image1 release];
        CGContextRelease(context);
        //image1 = nil;

        //totalLuminance = [n floatValue];                   // Calculate the total luminance.
        float f = [del.camcont.slider value];
        float total = totalLuminance * f;
        NSString *ns = [NSString stringWithFormat:@"Lux : %0.2f", total];
        NSLog(@"slider = %f", f);
        NSLog(@"totalLuminance = %f", totalLuminance);
        NSLog(@"%@", ns);
        //NSString *ns = [NSString initWithFormat:@"Lux : %0.2f", total];
        [del.camcont.lux setText:ns];   // Display the total luminance.

        self.stillImage = nil;
        //[self.stillImage release];
        ns = nil;
        //n = nil;
        //del = nil;
    }
    @catch (NSException *exception) {
        NSLog(@"main: Caught %@: %@", [exception name], [exception reason]);
    }
}
The memory management looks fine at first glance. As a workaround, you could consider [UIImage imageWithData:] in case the custom CGImageCreate code has a problem, since you are currently creating the UIImage from a CGImage.

It is not clear to me why you take a CMSampleBufferRef, create a CGImageRef from it, then create a UIImage, then take that UIImage's CGImageRef and suck the data out of it into an unsigned char pointer (which basically points at the same bytes the CMSampleBufferRef held in the first place).

You will simplify your life (and find debugging easier) if you do this instead:

CVPixelBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
size_t width = CVPixelBufferGetWidth(imageBuffer);
size_t height = CVPixelBufferGetHeight(imageBuffer);
uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);

// Copy the pixel data out, then unlock the buffer right away.
uint8_t *pixels = malloc(bytesPerRow * height);
memcpy(pixels, baseAddress, bytesPerRow * height);
CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

float totalLuminance = 0.0;
for (int r = 0; r < height; r++)
{
    // Step through each 4-byte pixel of the row.
    for (int p = 0; p < width * 4; p += 4)
    {
        totalLuminance += pixels[p + (r * bytesPerRow)] * 0.3
                        + pixels[p + 1 + (r * bytesPerRow)] * 0.59
                        + pixels[p + 2 + (r * bytesPerRow)] * 0.11;
    }
}
free(pixels);
totalLuminance /= (width * height);
You may want to provide a symbolicated crash log so we can look at the details of the crash?

@viks, did you find a solution? I am facing the same issue.

Hi, thanks for the reply, but imageWithData does not work either; it keeps failing at the same line after a few seconds. It was worth a try, though.

Are you able to post the crash log? For example, if you submitted this incident to Apple DTS, that is the first thing they would ask for...

But I cannot find any crash log in the Xcode organizer. I added an NSLog after each line and found that, each time the app shuts down, there is no crash log or any error message in the console.

Thanks for the reply. That is true. Thanks again; this will definitely help my app's performance. By the way, I cannot upvote your answer because I do not have enough reputation. Sorry.