iOS AVCaptureStillImageOutput.pngStillImageNSDataRepresentation?
This is my first time using AVCaptureStillImageOutput, and at some point I save a JPEG image. Instead of a JPEG, I would like to save a PNG image. What do I need to do for that? I have these 3 lines of code in my app:
let stillImageOutput = AVCaptureStillImageOutput()
stillImageOutput.outputSettings = [AVVideoCodecKey:AVVideoCodecJPEG]
let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
Is there a simple way to modify these lines to get what I want?
After searching the web, it seems the answer is no (unless I've missed something), but I still believe there must be a good solution. Apple's AV Foundation Programming Guide has sample code showing how to convert a CMSampleBuffer to a UIImage (under "Converting CMSampleBuffer to a UIImage Object"). From there, you can use

UIImagePNGRepresentation(image)

to encode it as PNG data.
Here is a Swift translation of that code:
extension UIImage
{
    // Translated from <https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/AVFoundationPG/Articles/06_MediaRepresentations.html#//apple_ref/doc/uid/TP40010188-CH2-SW4>
    convenience init?(fromSampleBuffer sampleBuffer: CMSampleBuffer)
    {
        guard let imageBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
        if CVPixelBufferLockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly) != kCVReturnSuccess { return nil }
        defer { CVPixelBufferUnlockBaseAddress(imageBuffer, kCVPixelBufferLock_ReadOnly) }
        let context = CGBitmapContextCreate(
            CVPixelBufferGetBaseAddress(imageBuffer),
            CVPixelBufferGetWidth(imageBuffer),
            CVPixelBufferGetHeight(imageBuffer),
            8,
            CVPixelBufferGetBytesPerRow(imageBuffer),
            CGColorSpaceCreateDeviceRGB(),
            CGBitmapInfo.ByteOrder32Little.rawValue | CGImageAlphaInfo.PremultipliedFirst.rawValue)
        guard let quartzImage = CGBitmapContextCreateImage(context) else { return nil }
        self.init(CGImage: quartzImage)
    }
}
And here is a Swift 4 version of the code above:
extension UIImage
{
    convenience init?(fromSampleBuffer sampleBuffer: CMSampleBuffer)
    {
        guard let imageBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
        if CVPixelBufferLockBaseAddress(imageBuffer, .readOnly) != kCVReturnSuccess { return nil }
        defer { CVPixelBufferUnlockBaseAddress(imageBuffer, .readOnly) }
        // Guard the context creation instead of force-unwrapping it below.
        guard let context = CGContext(
            data: CVPixelBufferGetBaseAddress(imageBuffer),
            width: CVPixelBufferGetWidth(imageBuffer),
            height: CVPixelBufferGetHeight(imageBuffer),
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(imageBuffer),
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue)
            else { return nil }
        guard let quartzImage = context.makeImage() else { return nil }
        self.init(cgImage: quartzImage)
    }
}
Thanks. The more I search, the more it seems there is no simpler way. I'll try to work from this and see.
Michel, I just updated my answer with a Swift translation of Apple's sample code. Give it a try and let me know if it works!