iOS: how to merge audio and video using AVMutableCompositionTrack
In my application I need to merge audio and video, and then play the merged file in a media player. How can I merge audio and video in iOS? Is there any sample source code for this? Please give me some suggestions. Thanks in advance.

Use this:
AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:audioUrl options:nil];
AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:videoUrl options:nil];
AVMutableComposition* mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionCommentaryTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
presetName:AVAssetExportPresetHighestQuality];
NSString* videoName = @"export.mov";
NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];
NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath])
{
[[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
}
_assetExport.outputFileType = @"com.apple.quicktime-movie";
_assetExport.outputURL = exportUrl;
_assetExport.shouldOptimizeForNetworkUse = YES;
[_assetExport exportAsynchronouslyWithCompletionHandler:^{
// your completion code here (check _assetExport.status before using the file)
}];
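The question also asks how to play the merged file. Below is a minimal sketch of what the completion handler could do once the export finishes; it assumes the export ended with AVAssetExportSessionStatusCompleted and uses MPMoviePlayerViewController from the MediaPlayer framework (matching the iOS 4-era tags), with self assumed to be a view controller that can present the player.

#import <MediaPlayer/MediaPlayer.h>

// Inside exportAsynchronouslyWithCompletionHandler, after checking the status:
if (_assetExport.status == AVAssetExportSessionStatusCompleted) {
    dispatch_async(dispatch_get_main_queue(), ^{
        // exportUrl is the NSURL the export session wrote to above;
        // self is assumed to be a UIViewController presenting the player
        MPMoviePlayerViewController *playerVC = [[MPMoviePlayerViewController alloc] initWithContentURL:exportUrl];
        [self presentMoviePlayerViewControllerAnimated:playerVC];
    });
}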
Visit this tutorial for merging audio and video files.

You can also merge audio and video by creating a mutable composition:
AVMutableComposition* composition = [[AVMutableComposition alloc]init];
AVURLAsset* video1 = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:path1] options:nil];
NSArray *pathComponents = [NSArray arrayWithObjects:
[NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject],@"MyAudio.m4a",nil];
NSURL *outputFileURL = [NSURL fileURLWithPathComponents:pathComponents];
AVAsset *audioAsset = [AVAsset assetWithURL:outputFileURL];
//Create mutable composition of audio type
AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero,video1.duration)
ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
AVMutableCompositionTrack* composedTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[composedTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, video1.duration)
ofTrack:[[video1 tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
// An output URL and file type must be set before exporting (example path shown)
exporter.outputURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"merged.mov"]];
exporter.outputFileType = AVFileTypeQuickTimeMovie;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    switch (exporter.status) {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"Failed to export video");
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"Export cancelled");
            break;
        default:
            break;
    }
}];
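One caveat that applies to both snippets above: calling objectAtIndex:0 on the result of tracksWithMediaType: will crash if the asset contains no track of that type (for example, a clip recorded without sound). A small defensive check, sketched here with the variable names from the second snippet, avoids that:

// Guard against assets that are missing the expected track before inserting
NSArray *audioTracks = [audioAsset tracksWithMediaType:AVMediaTypeAudio];
NSArray *videoTracks = [video1 tracksWithMediaType:AVMediaTypeVideo];
if (audioTracks.count == 0 || videoTracks.count == 0) {
    NSLog(@"Asset is missing an audio or video track; aborting the merge");
    return;
}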
For video merging, visit this tutorial; you can also find a sample project there that merges videos.

This answer comes a bit late, but it may help someone in the future. It repeats the audio if the video duration is longer than the audio:
+ (void)mergeVideoWithAudio:(NSURL *)videoUrl audioUrl:(NSURL *)audioUrl success:(void (^)(NSURL *url))success failure:(void (^)(NSError *error))failure {
AVMutableComposition *mixComposition = [AVMutableComposition new];
NSMutableArray<AVMutableCompositionTrack *> *mutableCompositionVideoTrack = [NSMutableArray new];
NSMutableArray<AVMutableCompositionTrack *> *mutableCompositionAudioTrack = [NSMutableArray new];
AVMutableVideoCompositionInstruction *totalVideoCompositionInstruction = [AVMutableVideoCompositionInstruction new];
AVAsset *aVideoAsset = [AVAsset assetWithURL:videoUrl];
AVAsset *aAudioAsset = [AVAsset assetWithURL:audioUrl];
AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
if (videoTrack && audioTrack) {
[mutableCompositionVideoTrack addObject:videoTrack];
[mutableCompositionAudioTrack addObject:audioTrack];
AVAssetTrack *aVideoAssetTrack = [aVideoAsset tracksWithMediaType:AVMediaTypeVideo].firstObject;
AVAssetTrack *aAudioAssetTrack = [aAudioAsset tracksWithMediaType:AVMediaTypeAudio].firstObject;
if (aVideoAssetTrack && aAudioAssetTrack) {
[mutableCompositionVideoTrack.firstObject insertTimeRange:CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration) ofTrack:aVideoAssetTrack atTime:kCMTimeZero error:nil];
CMTime videoDuration = aVideoAsset.duration;
if (CMTimeCompare(videoDuration, aAudioAsset.duration) == -1) {
[mutableCompositionAudioTrack.firstObject insertTimeRange:CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration) ofTrack:aAudioAssetTrack atTime:kCMTimeZero error:nil];
} else if (CMTimeCompare(videoDuration, aAudioAsset.duration) == 1) {
CMTime currentDuration = kCMTimeZero;
while (CMTimeCompare(currentDuration, videoDuration) == -1) {
// repeats audio
CMTime restTime = CMTimeSubtract(videoDuration, currentDuration);
CMTime maxTime = CMTimeMinimum(aAudioAsset.duration, restTime);
[mutableCompositionAudioTrack.firstObject insertTimeRange:CMTimeRangeMake(kCMTimeZero, maxTime) ofTrack:aAudioAssetTrack atTime:currentDuration error:nil];
currentDuration = CMTimeAdd(currentDuration, aAudioAsset.duration);
}
}
videoTrack.preferredTransform = aVideoAssetTrack.preferredTransform;
totalVideoCompositionInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, aVideoAssetTrack.timeRange.duration);
}
}
NSString *outputPath = [NSHomeDirectory() stringByAppendingPathComponent:@"tmp/screenCapture.mp4"];
if ([[NSFileManager defaultManager] fileExistsAtPath:outputPath]) {
[[NSFileManager defaultManager] removeItemAtPath:outputPath error:nil];
}
NSURL *outputURL = [NSURL fileURLWithPath:outputPath];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileTypeMPEG4;
exportSession.shouldOptimizeForNetworkUse = YES;
// try to export the file and handle the status cases
[exportSession exportAsynchronouslyWithCompletionHandler:^{
switch (exportSession.status) {
case AVAssetExportSessionStatusFailed:
failure(exportSession.error);
break;
case AVAssetExportSessionStatusCancelled:
failure(exportSession.error);
break;
default:
success(outputURL);
break;
}
}];
}
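A call site for the method above might look like the following; VideoMerger is a placeholder for whatever class declares the method, and the two bundle resources are hypothetical file names:

// Hypothetical caller; adjust the class name and resource names to your project
NSURL *videoURL = [[NSBundle mainBundle] URLForResource:@"clip" withExtension:@"mp4"];
NSURL *audioURL = [[NSBundle mainBundle] URLForResource:@"music" withExtension:@"m4a"];
[VideoMerger mergeVideoWithAudio:videoURL audioUrl:audioURL success:^(NSURL *url) {
    NSLog(@"Merged file written to %@", url);
} failure:^(NSError *error) {
    NSLog(@"Merge failed: %@", error);
}];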
Do some research first - for example, look at the WWDC videos and sample code.

I have already tried this code. It does not work, and there is a bug in the code above.

Replace initWithURL with URLAssetWithURL.

I used this code with a 1-second audio file and a 3-second video file. The result was a video with 1 second of audio over a black screen, then 2 more seconds of black screen, then the 3 seconds of video. Do you know what causes this? I only want to add the audio to the beginning of the video.

I am merging only one audio/video pair. The problem is that my video file is 40 seconds and the audio file is 28 seconds, so for the remaining 12 (40 - 28) seconds I want the audio to repeat from 0 seconds. How do I do that? Is there a direct way to do it?

@BrandonA: NSString *videoPath = [[NSBundle mainBundle] pathForResource:@"resource name" ofType:@"MOV"]; NSURL *videoURl = [NSURL fileURLWithPath:videoPath];

Note that link-only answers are discouraged; an answer should be the end point of a search for a solution (rather than yet another stopover of references, which tend to get stale over time). Please consider adding a stand-alone synopsis here, keeping the link as a reference.