iOS: CIContext render:toCVPixelBuffer:bounds:colorSpace: does not work with images that have an alpha channel


I am trying to add a watermark/logo to video recorded with AVFoundation's AVCaptureVideoDataOutput. The problem I am having is that the transparent parts of the UIImage come out black once the frames are written to the video. What am I doing wrong?

CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer);


You can composite the images in a way that preserves transparency and then render the result back into the pixel buffer. For example:

    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // Wrap the camera frame in a CIImage and composite the transparent logo over it.
    CIImage *cameraImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer];
    cameraImage = [self.logoImage imageByCompositingOverImage:cameraImage];

    // Render the composited image back into the same pixel buffer.
    CGColorSpaceRef cSpace = CGColorSpaceCreateDeviceRGB();
    [self.context render:cameraImage toCVPixelBuffer:pixelBuffer bounds:cameraImage.extent colorSpace:cSpace];

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
    CGColorSpaceRelease(cSpace);
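
The snippet above assumes that self.context (a CIContext) and self.logoImage (a CIImage) already exist. A minimal sketch of one way to set them up before capture starts; the "logo" asset name and the translation offset are placeholders, and any CIContext initializer will do:

    // One-time setup, e.g. while configuring the capture session.
    self.context = [CIContext contextWithOptions:nil];

    // Placeholder asset name; use a PNG that actually carries an alpha channel.
    UIImage *logo = [UIImage imageNamed:@"logo"];
    CIImage *logoCI = [[CIImage alloc] initWithImage:logo];

    // Optionally offset the logo from the lower-left corner of the video frame.
    self.logoImage = [logoCI imageByApplyingTransform:CGAffineTransformMakeTranslation(20.0, 20.0)];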

Hi Beowulf, thanks for your answer, it works for me. But now I have another problem: I render the transparent logo onto each CVPixelBufferRef inside the AVCaptureVideoDataOutputSampleBufferDelegate method -(void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection, and after I stop capturing the audio is out of sync with the video.
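
For reference, a minimal sketch of the delegate method described above, assuming the same self.context and self.logoImage properties; it only illustrates where the per-frame compositing happens and does not by itself address the audio sync problem:

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        if (pixelBuffer == NULL) {
            return; // audio sample buffers carry no pixel data
        }

        // Composite the transparent logo over the camera frame and render it back in place.
        CIImage *cameraImage = [[CIImage alloc] initWithCVPixelBuffer:pixelBuffer];
        CIImage *composited = [self.logoImage imageByCompositingOverImage:cameraImage];

        CGColorSpaceRef cSpace = CGColorSpaceCreateDeviceRGB();
        [self.context render:composited toCVPixelBuffer:pixelBuffer bounds:cameraImage.extent colorSpace:cSpace];
        CGColorSpaceRelease(cSpace);

        // The modified buffer is then handed to whatever writes the movie file
        // (e.g. an AVAssetWriterInput), which is not shown here.
    }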