iPhone: creating a UIImage from another UIImage's data with imageWithData: returns nil


I'm trying to retrieve the image data from a UIImage, modify it, and create a new UIImage from it. As a baseline, I first tried copying the data without any modification, but even that fails (`UIImage +imageWithData:` returns nil). Does anyone know why this doesn't work?

// I've confirmed that the following line works
UIImage *image = [UIImage imageNamed:@"image_foo.png"];

// Get a reference to the data; appears to work
CFDataRef dataRef = CGDataProviderCopyData(CGImageGetDataProvider([image CGImage]));

// Get the length of memory to allocate; seems to work (e.g. 190000)
CFIndex length = CFDataGetLength(dataRef);
UInt8 *buff = malloc(length);

// Again, appears to work
CFDataGetBytes(dataRef, CFRangeMake(0, length), buff);

// Again, appears to work
NSData *newData = [NSData dataWithBytesNoCopy:buff length:length];

// This fails by returning nil
UIImage *image2 = [UIImage imageWithData:newData];
Note:

I also tried using:

UInt8 *data = CFDataGetBytePtr(dataRef);

and then passing that data directly into NSData, with the same result.

I believe imageWithData: takes image *file* data, not the image *display* (pixel) data, which is what you're passing it. Try this:

NSData * newData = UIImageJPEGRepresentation(image, 1.0);
UIImage *image2 = [UIImage imageWithData:newData]; 
UIImageJPEGRepresentation() returns the data you would write out to create a .jpg file on disk. That, I'm 99.44% sure, is what imageWithData: wants.
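A quick way to see the difference: file data begins with a container signature that a decoder can sniff, while CGDataProviderCopyData hands back headerless decoded pixels. A minimal C sketch (the signatures are from the PNG and JPEG specs; the function names are mine, for illustration only):

```c
#include <stddef.h>
#include <string.h>

/* PNG files start with a fixed 8-byte signature; JPEG files with FF D8.
   Raw RGBA pixel buffers carry no such header, which is why a
   format-sniffing decoder has nothing to go on. */
static const unsigned char PNG_SIG[8] =
    {0x89, 'P', 'N', 'G', 0x0D, 0x0A, 0x1A, 0x0A};

int looks_like_png(const unsigned char *buf, size_t len) {
    return len >= 8 && memcmp(buf, PNG_SIG, 8) == 0;
}

int looks_like_jpeg(const unsigned char *buf, size_t len) {
    return len >= 2 && buf[0] == 0xFF && buf[1] == 0xD8;
}
```

Raw pixel bytes (e.g. all 0xFF for a fully white, fully opaque RGBA image) match neither signature, so imageWithData: has no way to tell width, height, or pixel format from them.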

Note: if you want to process the data before creating image2, then you really do want the display data, in which case getting an image back out of it is a bit more involved, but looks like this:

    // Create a color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    if (colorSpace == NULL)
    {
        fprintf(stderr, "Error allocating color space\n");
        return nil;
    }

    // Assumes `bits` is a malloc'd width * height * 4-byte RGBA pixel
    // buffer, `size` is the image's CGSize, and IMAGE_BYTE_ORDER is a
    // CGBitmapInfo byte-order flag (e.g. kCGBitmapByteOrder32Big).
    CGContextRef context = CGBitmapContextCreate(bits, size.width, size.height,
            8, size.width * 4, colorSpace,
            kCGImageAlphaPremultipliedLast | IMAGE_BYTE_ORDER);
    CGColorSpaceRelease(colorSpace);

    if (context == NULL)
    {
        fprintf(stderr, "Error: Context not created!");
        return nil;
    }

    CGImageRef ref = CGBitmapContextCreateImage(context);
    //free(CGBitmapContextGetData(context));  // this appears to free bits -- probably not mine to free!
    CGContextRelease(context);

    UIImage *img = [UIImage imageWithCGImage:ref];
    CFRelease(ref);  // ?!?! Bug in 3.0 simulator.  Run in 3.1 or higher.

    return img;

(The code above is adapted from one of Erica Sadun's samples. The smart parts are hers; the errors are all mine. But that's the general idea, and it should work.)
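For reference, the stride arithmetic that the CGBitmapContextCreate call above relies on: with 8 bits per component and premultiplied RGBA, each pixel occupies 4 bytes, so bytesPerRow is width * 4 and the buffer must hold height * bytesPerRow bytes. A small plain-C sketch (helper names are mine):

```c
#include <stddef.h>

/* Byte layout of an 8-bit-per-channel RGBA bitmap, matching the
   size.width * 4 stride passed to CGBitmapContextCreate above. */
size_t rgba_bytes_per_row(size_t width) {
    return width * 4;  /* 4 bytes per pixel: R, G, B, A */
}

size_t rgba_buffer_size(size_t width, size_t height) {
    return height * rgba_bytes_per_row(width);
}

/* Byte offset of component 0..3 (R, G, B, A) of the pixel at (x, y). */
size_t rgba_offset(size_t width, size_t x, size_t y, size_t component) {
    return y * rgba_bytes_per_row(width) + x * 4 + component;
}
```

This is also how you would index into `bits` to modify individual pixels before handing the buffer to the bitmap context.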

Why dataWithBytesNoCopy:? It's used because he allocated the buffer himself with malloc. That way, for the whole lifetime of the NSData object, he can still get at image2's actual image buffer.
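The ownership transfer dataWithBytesNoCopy: performs can be sketched in plain C (this is an analogy, not Foundation's implementation): the wrapper adopts the malloc'd buffer instead of duplicating it, and frees it when the wrapper is destroyed.

```c
#include <stdlib.h>

/* A no-copy byte wrapper: adopts a malloc'd buffer rather than
   duplicating it, the way dataWithBytesNoCopy:length: does (the
   NSData frees the bytes when it is deallocated). */
typedef struct {
    unsigned char *bytes;  /* adopted, not copied */
    size_t length;
} ByteBuf;

ByteBuf bytebuf_adopt(unsigned char *malloced, size_t length) {
    ByteBuf b = { malloced, length };
    return b;  /* the wrapper now owns `malloced` */
}

void bytebuf_free(ByteBuf *b) {
    free(b->bytes);  /* owner releases the adopted buffer */
    b->bytes = NULL;
    b->length = 0;
}
```

Because the bytes are shared rather than copied, writes through the original pointer remain visible through the wrapper, which is exactly why the asker can keep touching the buffer backing newData.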