iOS: compositing two videos where one has transparency


I have two videos, one of which is an overlay with a transparent background (an explosion clip I am trying to add on top of the main video). When I combine them with AVMutableComposition, the overlay's alpha channel is ignored: I see only the second video, with a black background instead of transparency.

As a test, I set the overlay video's opacity to 0.9 to confirm the two were actually being merged, and I could indeed see the main video underneath the overlay (not what I want, of course, but it proves the composition works). Any ideas on how to make the alpha channel of the second video work?

    NSError* error = nil;

    AVMutableComposition *comp = [AVMutableComposition composition];
    // let AVFoundation assign track IDs; hard-coding 0/1 is fragile
    // (0 is kCMPersistentTrackID_Invalid, and 1 may collide)
    AVMutableCompositionTrack* videoCompTrack = [comp addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack* videoCompTrack2 = [comp addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack* audioCompTrack = [comp addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];

    // main video
    AVURLAsset* videoAssetMain = [AVURLAsset URLAssetWithURL:url1 options:nil];
    NSArray* tracks = [videoAssetMain tracksWithMediaType:AVMediaTypeVideo];
    AVAssetTrack* videoTrackMain = [tracks firstObject];
    tracks = [videoAssetMain tracksWithMediaType:AVMediaTypeAudio];
    AVAssetTrack* audioTrackMain = [tracks firstObject];
    CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, videoTrackMain.timeRange.duration);

    // overlay video with alpha channel
    AVURLAsset* videoAssetOver = [AVURLAsset URLAssetWithURL:url2 options:nil];
    tracks = [videoAssetOver tracksWithMediaType:AVMediaTypeVideo];
    AVAssetTrack* videoTrackOver = [tracks firstObject];

    [videoCompTrack insertTimeRange:timeRange ofTrack:videoTrackMain atTime:kCMTimeZero error:&error];
    [videoCompTrack2 insertTimeRange:timeRange ofTrack:videoTrackOver atTime:kCMTimeZero error:&error];

    [audioCompTrack insertTimeRange:timeRange ofTrack:audioTrackMain atTime:kCMTimeZero error:&error];

    AVMutableVideoCompositionLayerInstruction *inst1 = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompTrack];
    [inst1 setOpacity:1 atTime:kCMTimeZero];
    AVMutableVideoCompositionLayerInstruction *inst2 = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoCompTrack2];
    // adding translucency to test that bottom video is there
    // [inst2 setOpacity:0.9 atTime:kCMTimeZero];
    // stretch overlay video on top of main video
    CGAffineTransform scale = CGAffineTransformMakeScale(videoTrackMain.naturalSize.width/videoTrackOver.naturalSize.width, videoTrackMain.naturalSize.height/videoTrackOver.naturalSize.height);
    [inst2 setTransform:scale atTime:kCMTimeZero];

    AVMutableVideoCompositionInstruction *trans = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    trans.backgroundColor = [UIColor clearColor].CGColor;
    trans.timeRange = timeRange;
    trans.layerInstructions = [NSArray arrayWithObjects:inst2,inst1, nil];

    AVMutableVideoComposition* videoComp = [AVMutableVideoComposition videoComposition];
    videoComp.instructions = [NSArray arrayWithObjects:trans,nil];
    videoComp.frameDuration = CMTimeMake(1, 30);
    videoComp.renderSize = comp.naturalSize;

    AVAssetExportSession* expSession = [[AVAssetExportSession alloc] initWithAsset:comp presetName:AVAssetExportPresetHighestQuality];
    NSString* newVideoPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"output_final.mov"];
    if ([[NSFileManager defaultManager] fileExistsAtPath:newVideoPath]) {
        [[NSFileManager defaultManager] removeItemAtPath:newVideoPath error:&error];
    }
    expSession.outputURL = [NSURL fileURLWithPath:newVideoPath];
    expSession.outputFileType = AVFileTypeQuickTimeMovie;
    expSession.videoComposition = videoComp;

    [expSession exportAsynchronouslyWithCompletionHandler:^{
        // check that the export actually succeeded before reporting the URL
        if (expSession.status == AVAssetExportSessionStatusCompleted && delegate) {
            [delegate videoProcessor:self didFinish:expSession.outputURL];
        }
    }];
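One thing worth verifying before blaming the composition: common delivery codecs such as H.264 cannot carry an alpha channel at all, so the overlay file may not actually contain one. A minimal sketch of such a check on the track's format descriptions (assumes iOS 13+, where the kCMFormatDescriptionExtension_ContainsAlphaChannel key is available):

```objectivec
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// Sketch: returns YES if any of the track's format descriptions
// declares an alpha channel (e.g. ProRes 4444, HEVC with alpha).
static BOOL TrackContainsAlpha(AVAssetTrack *track) {
    for (id desc in track.formatDescriptions) {
        CMFormatDescriptionRef fmt = (__bridge CMFormatDescriptionRef)desc;
        CFTypeRef hasAlpha = CMFormatDescriptionGetExtension(fmt,
            kCMFormatDescriptionExtension_ContainsAlphaChannel);
        if (hasAlpha && CFBooleanGetValue((CFBooleanRef)hasAlpha)) {
            return YES;
        }
    }
    return NO;
}
```

If this returns NO for the overlay track, no composition settings will help; the source file itself has no per-pixel transparency to honor.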

I had the same problem and could not find any way to supply a BGRA color for the background color, so no solution on that route. Instead, I did a manual frame-by-frame merge using my own approach; a working example can be found in the comments of the following answer. From the header documentation:
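The core of that frame-by-frame approach can be sketched as follows: pull decoded frames from both movies with AVAssetReader, blend them with Core Image (whose source-over compositing does honor per-pixel alpha, unlike the stock AVFoundation compositor), and write the result with AVAssetWriter. The names `mainOutput`, `overlayOutput`, and `adaptor` are illustrative placeholders; reader/writer setup, end-of-stream handling, and timing are simplified, and both readers are assumed to deliver BGRA pixel buffers:

```objectivec
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>

CIContext *ctx = [CIContext context];
CMSampleBufferRef mainBuf, overBuf;
while ((mainBuf = [mainOutput copyNextSampleBuffer]) &&
       (overBuf = [overlayOutput copyNextSampleBuffer])) {
    CVPixelBufferRef mainPx = CMSampleBufferGetImageBuffer(mainBuf);
    CVPixelBufferRef overPx = CMSampleBufferGetImageBuffer(overBuf);

    // Source-over compositing respects the overlay's alpha channel.
    CIImage *bottom = [CIImage imageWithCVPixelBuffer:mainPx];
    CIImage *top    = [CIImage imageWithCVPixelBuffer:overPx];
    CIImage *merged = [top imageByCompositingOverImage:bottom];

    // Render the blended frame back into the main frame's pixel buffer
    // and append it via an AVAssetWriterInputPixelBufferAdaptor.
    [ctx render:merged toCVPixelBuffer:mainPx];
    [adaptor appendPixelBuffer:mainPx
          withPresentationTime:CMSampleBufferGetPresentationTimeStamp(mainBuf)];

    CFRelease(mainBuf);
    CFRelease(overBuf);
}
```

This is slower than an AVVideoComposition export, but it gives you full control over how the two frames are blended.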
AVVideoCompositionInstruction


    /* Indicates the background color of the composition. Solid BGRA colors only are supported; patterns and other color refs that are not supported will be ignored.
       If the background color is not specified the video compositor will use a default backgroundColor of opaque black.
       If the rendered pixel buffer does not have alpha, the alpha value of the backgroundColor will be ignored. */
    @property (nonatomic, retain) __attribute__((NSObject)) CGColorRef backgroundColor;
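Regarding the "solid BGRA colors only" wording: note that `[UIColor clearColor].CGColor` is created in a grayscale color space, which may be why the compositor ignores it. If you want to hand the compositor a genuinely RGBA clear color, one way to try it (a sketch, not verified to fix this particular issue) is to build the CGColor explicitly in a device RGB color space:

```objectivec
// Transparent black, built explicitly in an RGB color space so the
// video compositor sees a "solid BGRA color" rather than a
// grayscale-space color it may reject.
CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
CGFloat comps[4] = {0.0, 0.0, 0.0, 0.0}; // R, G, B, A
CGColorRef clearRGBA = CGColorCreate(rgb, comps);
trans.backgroundColor = clearRGBA; // property retains the color
CGColorRelease(clearRGBA);
CGColorSpaceRelease(rgb);
```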