Objective-C: Compositing two videos on top of each other with alpha

AVFoundation lets you "compose" 2 assets (2 videos) as 2 "tracks", just like in Final Cut Pro, for example.

In theory, I should be able to composite the two videos on top of each other with alpha and see both.

Either I'm doing something wrong or there is a bug somewhere, because the following test code, although a bit messy, clearly shows that I should see two videos, while I only see one, as shown in the screenshot (the "blue" square is IMG_1388.m4v).

For whatever reason, IMG_1383.MOV is never shown.

NSError *error = nil; // receives errors from insertTimeRange:ofTrack:atTime: below
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], AVURLAssetPreferPreciseDurationAndTimingKey, nil];
AVMutableComposition *composition = [AVMutableComposition composition];
CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(4, 1));
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];

// Track B
NSURL *urlVideo2 = [NSURL URLWithString:@"file://localhost/Users/me/Movies/Temp/IMG_1388.m4v"];
AVAsset *video2 = [AVURLAsset URLAssetWithURL:urlVideo2 options:options];
AVMutableCompositionTrack *videoTrack2 = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:0];
NSArray *videoAssetTracks2 = [video2 tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack2 = ([videoAssetTracks2 count] > 0 ? [videoAssetTracks2 objectAtIndex:0] : nil);
[videoTrack2 insertTimeRange:timeRange ofTrack:videoAssetTrack2 atTime:kCMTimeZero error:&error];

AVMutableVideoCompositionLayerInstruction *to = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack2];
[to setOpacity:.5 atTime:kCMTimeZero];
[to setTransform:CGAffineTransformScale(videoAssetTrack2.preferredTransform, .5, .5) atTime:kCMTimeZero];

// Track A
NSURL *urlVideo = [NSURL URLWithString:@"file://localhost/Users/me/Movies/Temp/IMG_1383.MOV"];
AVURLAsset *video = [AVURLAsset URLAssetWithURL:urlVideo options:options];
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:1];
NSArray *videoAssetTracks = [video tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack = ([videoAssetTracks count] > 0 ? [videoAssetTracks objectAtIndex:0] : nil);
[videoTrack insertTimeRange:timeRange ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];

AVMutableVideoCompositionLayerInstruction *from = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack];
[from setOpacity:.5 atTime:kCMTimeZero];

// Video Composition
AVMutableVideoCompositionInstruction *transition = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
transition.backgroundColor = [[UIColor clearColor] CGColor];
transition.timeRange = timeRange;
transition.layerInstructions = [NSArray arrayWithObjects:to, from, nil];
videoComposition.instructions = [NSArray arrayWithObjects:transition,  nil];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = CGSizeMake(480, 360);


// Export
NSURL *outputURL = [NSURL URLWithString:@"file://localhost/Users/me/Movies/Temp/export.MOV"];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:[[composition copy] autorelease] presetName:AVAssetExportPresetHighestQuality];
[exportSession setOutputFileType:AVFileTypeQuickTimeMovie]; // same UTI as @"com.apple.quicktime-movie"
exportSession.outputURL = outputURL;
exportSession.videoComposition = videoComposition;
[exportSession exportAsynchronouslyWithCompletionHandler:nil]; // nil handler: export errors go unreported

// Player
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
playerItem.videoComposition = videoComposition;
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
Do you see anything wrong here?


The "goal" of this code is to "record" the camera input (video 1) and the OpenGL output (video 2). I also tried to write them "directly" with buffers and the like, but had no success there either :( It turns out AVFoundation is not as simple as I thought.
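
For reference, the "direct" buffer-writing route mentioned above usually goes through AVAssetWriter. Below is a minimal sketch, not the poster's code, assuming the frames arrive as CVPixelBufferRefs read back from OpenGL; the output path, dimensions, and frame source are all assumptions:

NSError *writerError = nil;
NSURL *writerURL = [NSURL fileURLWithPath:@"/tmp/direct.mov"]; // placeholder path (assumption)
AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:writerURL fileType:AVFileTypeQuickTimeMovie error:&writerError];
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:480], AVVideoWidthKey,
                               [NSNumber numberWithInt:360], AVVideoHeightKey, nil];
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
    assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput sourcePixelBufferAttributes:nil];
[writer addInput:writerInput];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];
// For each rendered frame (a CVPixelBufferRef plus its presentation time):
// [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:frameTime];
// When done:
// [writerInput markAsFinished];
// [writer finishWriting];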

I think you have it wrong.

A video file may contain several data streams. For example, if it is a video file with sound, the file will have 2 streams: an audio stream and a video stream. Another example is a surround-sound video file, which may include 5 or more audio streams and 1 video stream.

As with audio, most video container formats (MOV, MP4, etc.) support multiple video streams in one file, but this does not mean those streams have any relationship to each other; they are simply stored in the same file container. For example, if you open such a file with QuickTime, you will get as many windows as there are video streams in the file.
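
As a small illustration of the point above, one can enumerate the streams ("tracks") stored in a single container file; the file path here is an assumption:

NSURL *fileURL = [NSURL fileURLWithPath:@"/path/to/movie.mov"];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
for (AVAssetTrack *track in asset.tracks) {
    NSLog(@"track %d: mediaType=%@", track.trackID, track.mediaType);
}
// A movie with sound typically logs one video and one audio track; a file
// holding several independent video streams logs several video tracks.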

In any case, video streams do not get "mixed" that way. What you are trying to achieve is related to signal processing of the video streams, and I really suggest you read more about it.


If you don't really need to "mix" the video data into one file, you may want to display the two video files on top of each other using MPMediaPlayer. Keep in mind that processing video data is usually a CPU-intensive problem, one you may (sometimes) not be able to solve with present-day iOS devices.
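
For instance, here is a minimal sketch of that "display instead of mix" approach, using two overlapping AVPlayerLayers rather than MPMediaPlayer; someView and the two URLs are assumptions, not code from the answer:

AVPlayer *playerA = [AVPlayer playerWithURL:urlVideoA];
AVPlayer *playerB = [AVPlayer playerWithURL:urlVideoB];
AVPlayerLayer *layerA = [AVPlayerLayer playerLayerWithPlayer:playerA];
AVPlayerLayer *layerB = [AVPlayerLayer playerLayerWithPlayer:playerB];
layerA.frame = someView.bounds;
layerB.frame = someView.bounds;
layerB.opacity = .5; // let the bottom layer show through
[someView.layer addSublayer:layerA];
[someView.layer addSublayer:layerB];
[playerA play];
[playerB play];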

It looks fine, except for this part:

AVMutableVideoCompositionLayerInstruction *from = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack];
AVMutableVideoCompositionLayerInstruction *to = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack2];
You need to build the layer instructions with videoTrack and videoTrack2, the tracks you added to the composition, not with the original asset tracks videoAssetTrack and videoAssetTrack2.

Also, adding a transform to rotate the video is somewhat tricky (like anything beyond the basics in AVFoundation), so I have just commented that line out to make it play the two videos; see the sketch after the modified code below.

Here is the modified code:

NSError *error = nil;
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], AVURLAssetPreferPreciseDurationAndTimingKey, nil];
AVMutableComposition *composition = [AVMutableComposition composition];
CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(4, 1));
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];

// Track B
NSURL *urlVideo2 = [[NSBundle mainBundle] URLForResource:@"b" withExtension:@"mov"];        
AVAsset *video2 = [AVURLAsset URLAssetWithURL:urlVideo2 options:options];
AVMutableCompositionTrack *videoTrack2 = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:0];
NSArray *videoAssetTracks2 = [video2 tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack2 = ([videoAssetTracks2 count] > 0 ? [videoAssetTracks2 objectAtIndex:0] : nil);
[videoTrack2 insertTimeRange:timeRange ofTrack:videoAssetTrack2 atTime:kCMTimeZero error:&error];

AVMutableVideoCompositionLayerInstruction *to = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack2];
[to setOpacity:.5 atTime:kCMTimeZero];
//[to setTransform:CGAffineTransformScale(videoAssetTrack2.preferredTransform, .5, .5) atTime:kCMTimeZero];

// Track A
NSURL *urlVideo = [[NSBundle mainBundle] URLForResource:@"a" withExtension:@"mov"];        
AVURLAsset *video = [AVURLAsset URLAssetWithURL:urlVideo options:options];
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:1];
NSArray *videoAssetTracks = [video tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack = ([videoAssetTracks count] > 0 ? [videoAssetTracks objectAtIndex:0] : nil);
[videoTrack insertTimeRange:timeRange ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];

AVMutableVideoCompositionLayerInstruction *from = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
[from setOpacity:.5 atTime:kCMTimeZero];

// Video Composition
AVMutableVideoCompositionInstruction *transition = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
transition.backgroundColor = [[UIColor clearColor] CGColor];
transition.timeRange = timeRange;
transition.layerInstructions = [NSArray arrayWithObjects:to, from, nil];
videoComposition.instructions = [NSArray arrayWithObjects:transition,  nil];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = composition.naturalSize; // CGSizeMake(480, 360);
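
As for the commented-out rotation: one common pattern, offered here only as a sketch untested against these particular assets, is to apply the source track's preferredTransform first and concatenate the scale onto it, so that any translation baked into the capture transform gets scaled as well:

CGAffineTransform rotation = videoAssetTrack2.preferredTransform; // capture orientation
CGAffineTransform t = CGAffineTransformConcat(rotation, CGAffineTransformMakeScale(.5, .5)); // rotate, then shrink
[to setTransform:t atTime:kCMTimeZero];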

I can't get it (your code) to work yet, but... I find it strange that I have to pass videoTrack, which is an AVMutableCompositionTrack *, when the docs/header clearly declare videoCompositionLayerInstructionWithAssetTrack:(AVAssetTrack *)track (so the argument should be an AVAssetTrack *, which my videoAssetTrack is). Thanks for your help anyway! In the end, I composed the "two videos" "frame by frame".

AVMutableCompositionTrack is a subclass of AVAssetTrack. You need the instructions on the tracks in the composition (videoTrack and videoTrack2), otherwise they have no effect on the result.

Correct. Running the code with my :) videos gives me -[AVPlayerItem setVideoComposition:] video composition must have positive renderSize.

Which code do you use to "display" the video on the AVPlayerLayer? The player code is standard, similar to yours. Did you copy the last line: videoComposition.renderSize = composition.naturalSize;?

Were you able to make a video with the two videos on top of each other? Please tell us what changes you made to achieve this.

He is obviously trying to overlay two videos on top of each other, with the opacities adding up to 1 so that you can see the underlying video. If you are suggesting the composition should have only one video track, how would you add the two tracks to the composition for mixing?

Hello @StuFF mc, I want to merge multiple videos with transitions. I used your code but cannot get the output file in the Documents directory; I used djromero's answer but it does not work either. Please help me with any suggestions.
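
Regarding the last comment about the missing output file: below is a hedged sketch, not from the answer above, that exports to the app's Documents directory and actually reports failures (a nil completion handler hides errors such as a file already existing at the output URL):

NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSURL *exportURL = [NSURL fileURLWithPath:[docs stringByAppendingPathComponent:@"export.mov"]];
[[NSFileManager defaultManager] removeItemAtURL:exportURL error:nil]; // the export fails if the file already exists
AVAssetExportSession *session = [AVAssetExportSession exportSessionWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
session.outputURL = exportURL;
session.outputFileType = AVFileTypeQuickTimeMovie;
session.videoComposition = videoComposition;
[session exportAsynchronouslyWithCompletionHandler:^{
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"exported to %@", exportURL);
    } else {
        NSLog(@"export failed: %@", session.error);
    }
}];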