How can I save a TIFF photo from AVFoundation's captureStillImageAsynchronouslyFromConnection to a file with EXIF metadata on the iPhone (iOS)?
With this question I only want to know my options using Xcode and iOS, without external libraries. (I am already exploring the possibility of using libtiff in another app.)

Problem

For weeks I have been sifting through Stack Overflow, and I have found working solutions for each of my problems individually. There are 4 things I need to do:

- I need the RGBA data as it comes from the camera, without any compression
- I need as much metadata as possible, especially EXIF
- I need to save in TIFF format, lossless and compatible with other software
- I need to save to a file rather than the photo library, to keep the pictures from casual viewing

I can get 2. and 4. by using JPEG.
- (Which makes me wish I could use JPEG; it is so comfortable to do what Apple likes best)
I can get 1., 3. and 4. with the raw data.
- (That would be the point of using libtiff, for example)
- (It also works with kUTTypeTIFF, but not with the metadata)
Here is the complete sequence of operations in captureStillImageAsynchronouslyFromConnection:
[[self myAVCaptureStillImageOutput] captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:
 ^(CMSampleBufferRef imageSampleBuffer, NSError *error)
{
    // get all the metadata in the image
    CFDictionaryRef metadata = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, imageSampleBuffer, kCMAttachmentMode_ShouldPropagate);

    // get image reference
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(imageSampleBuffer);

    // >>>>>>>>>> lock buffer address
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    // get information about the image
    uint8_t *baseAddress = (uint8_t *)CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // create suitable color space
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    // create suitable context (suitable for camera output setting kCVPixelFormatType_32BGRA)
    CGContextRef newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);

    // <<<<<<<<<< unlock buffer address
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);

    // release color space
    CGColorSpaceRelease(colorSpace);

    // create a CGImageRef from the CVImageBufferRef
    CGImageRef newImage = CGBitmapContextCreateImage(newContext);

    // release context
    CGContextRelease(newContext);

    // create destination and write image with metadata
    CFURLRef url = (__bridge CFURLRef)[NSURL fileURLWithPath:filePath isDirectory:NO];
    CGImageDestinationRef destination = CGImageDestinationCreateWithURL(url, kUTTypeTIFF, 1, NULL);
    CGImageDestinationAddImage(destination, newImage, metadata);

    // finalize and release destination
    CGImageDestinationFinalize(destination);
    CFRelease(destination);

    // release the image and the copied metadata
    CGImageRelease(newImage);
    if (metadata) CFRelease(metadata);
}];
With this I get a nice, nominally uncompressed image in nominal TIFF format, with all the metadata. (It is mirrored on other systems, but now that I can write EXIF and other metadata, I trust I can tune that too.)
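A side note on the mirroring: one way to tune it (my own sketch, not part of the original thread) is to overwrite the orientation tag in the copied metadata before handing it to the image destination, so TIFF/EXIF readers on other systems know how to display the pixels. The helper name is an assumption; it is written in the Swift 2 style used elsewhere in this thread:

```swift
import Foundation
import ImageIO

// Sketch (assumption): copy the captured metadata and force an explicit
// orientation before writing. Orientation 1 = "top-left", i.e. no mirroring.
func metadataWithOrientation(metadata: CFDictionary, orientation: Int) -> NSDictionary {
    let fixed = NSMutableDictionary(dictionary: metadata)
    fixed[kCGImagePropertyOrientation as String] = orientation
    return fixed
}

// Usage, with `destination`, `newImage` and `metadata` as in the code above:
// CGImageDestinationAddImage(destination, newImage,
//                            metadataWithOrientation(metadata, orientation: 1))
```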
Thanks again for the help!

Since you have already cracked 1., 3. and 4., the only hurdle you seem to be missing is saving the data and the metadata together. Try this (it assumes the unprocessed data is in a CMSampleBufferRef named myImageDataSampleBuffer, and that you have already done the heavy lifting of getting the graphics data into a CGImageRef named myImage; the code is further down the page):
This thread was very helpful in solving a very similar problem, so I thought I would provide a Swift 2.0 implementation of the solution, in case anyone comes looking:
stillImageOutput?.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (imageDataSampleBuffer, error) -> Void in
    // get image meta data (EXIF, etc.)
    let metaData: CFDictionaryRef? = CMCopyDictionaryOfAttachments(kCFAllocatorDefault, imageDataSampleBuffer, kCMAttachmentMode_ShouldPropagate)

    // get reference to image
    guard let imageBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer) else { return }

    // lock the buffer
    CVPixelBufferLockBaseAddress(imageBuffer, 0)

    // read image properties
    let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
    let width = CVPixelBufferGetWidth(imageBuffer)
    let height = CVPixelBufferGetHeight(imageBuffer)

    // color space
    let colorSpace = CGColorSpaceCreateDeviceRGB()

    // context - camera output settings kCVPixelFormatType_32BGRA
    let bitmapInfo = CGBitmapInfo(rawValue: CGImageAlphaInfo.PremultipliedFirst.rawValue).union(.ByteOrder32Little)
    let newContext = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow, colorSpace, bitmapInfo.rawValue)

    // unlock buffer
    CVPixelBufferUnlockBaseAddress(imageBuffer, 0)

    // create a CGImageRef from the CVImageBufferRef
    guard let newImage = CGBitmapContextCreateImage(newContext) else {
        return
    }

    // create tmp file and write image with metadata
    let fileName = String(format: "%@_%@", NSProcessInfo.processInfo().globallyUniqueString, "cap.tiff")
    let fileURL = NSURL(fileURLWithPath: NSTemporaryDirectory()).URLByAppendingPathComponent(fileName)
    if let destination = CGImageDestinationCreateWithURL(fileURL, kUTTypeTIFF, 1, nil) {
        CGImageDestinationAddImage(destination, newImage, metaData)
        let wrote = CGImageDestinationFinalize(destination)
        if !wrote || !NSFileManager.defaultManager().fileExistsAtPath(fileURL.path!) {
            // the file was not written - handle the error here
            return
        }
    }
})
P.S. For this to work, you have to configure the image buffer like this:
stillImageOutput = AVCaptureStillImageOutput()
stillImageOutput?.outputSettings = [ kCVPixelBufferPixelFormatTypeKey: Int(kCVPixelFormatType_32BGRA) ]
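For context, here is a minimal sketch of how that output could be wired into a capture session; the preset and the device selection are my assumptions, not part of the answer above:

```swift
import AVFoundation

// Sketch: a session whose still-image output delivers the uncompressed
// BGRA sample buffers that the completion handler above expects.
let session = AVCaptureSession()
session.sessionPreset = AVCaptureSessionPresetPhoto

let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
if let input = try? AVCaptureDeviceInput(device: device) where session.canAddInput(input) {
    session.addInput(input)
}

let stillImageOutput = AVCaptureStillImageOutput()
stillImageOutput.outputSettings = [kCVPixelBufferPixelFormatTypeKey: Int(kCVPixelFormatType_32BGRA)]
if session.canAddOutput(stillImageOutput) {
    session.addOutput(stillImageOutput)
}
session.startRunning()
```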
CFDictionaryRef metadata = CMCopyDictionaryOfAttachments(kCFAllocatorDefault,
                                                         myImageDataSampleBuffer,
                                                         kCMAttachmentMode_ShouldPropagate);
NSFileManager* fm = [[NSFileManager alloc] init];
NSURL* pathUrl = [fm URLForDirectory:saveDir
                            inDomain:NSUserDomainMask
                   appropriateForURL:nil
                              create:YES
                               error:nil];
NSURL* saveUrl = [pathUrl URLByAppendingPathComponent:@"myfilename.tif"];
CGImageDestinationRef destination = CGImageDestinationCreateWithURL((__bridge CFURLRef)saveUrl,
                                                                    (CFStringRef)@"public.tiff", 1, NULL);
CGImageDestinationAddImage(destination, myImage, metadata);
CGImageDestinationFinalize(destination);
CFRelease(destination);

Thank you very much! I will try it as soon as possible :) and I won't forget. — I have not had time to try it yet, because right now I am busy with another problem in my software. I promise! — Please take another look: how do I read these files back (I need to display thumbnails in a UICollectionView)?
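Regarding the comment about reading the files back for a UICollectionView: ImageIO can decode the saved TIFFs and produce downsampled thumbnails directly. A sketch (the function name and the pixel-size parameter are my assumptions), again in the Swift 2 style of this thread:

```swift
import UIKit
import ImageIO

// Sketch: load a small thumbnail from a saved TIFF without decoding the
// full-size image, suitable for a UICollectionView cell.
func thumbnailForTIFF(fileURL: NSURL, maxPixelSize: Int) -> UIImage? {
    guard let source = CGImageSourceCreateWithURL(fileURL, nil) else { return nil }
    let options: [NSString: AnyObject] = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,   // honor EXIF orientation
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ]
    guard let thumb = CGImageSourceCreateThumbnailAtIndex(source, 0, options) else { return nil }
    return UIImage(CGImage: thumb)
}
```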