How to add audio to a video file with the iPhone SDK

I have a video file and an audio file. Is it possible to merge them into a single video with sound? I think AVMutableComposition should help me here, but I still can't figure out how. Any suggestions?

Yes, it's possible. Here is a piece of code that adds audio to an existing composition. I took it from Apple's sample code; you should probably look at the whole project, you'll find it very useful. The project is AVEditDemo, and you can find it in the WWDC 2010 materials Apple posted at developer.apple.com/videos/WWDC/2010. Hope it helps.

- (void)addCommentaryTrackToComposition:(AVMutableComposition *)composition withAudioMix:(AVMutableAudioMix *)audioMix
{
    NSInteger i;
    NSArray *tracksToDuck = [composition tracksWithMediaType:AVMediaTypeAudio]; // before we add the commentary

    // Clip commentary duration to composition duration.
    CMTimeRange commentaryTimeRange = CMTimeRangeMake(self.commentaryStartTime, self.commentary.duration);
    if (CMTIME_COMPARE_INLINE(CMTimeRangeGetEnd(commentaryTimeRange), >, [composition duration]))
        commentaryTimeRange.duration = CMTimeSubtract([composition duration], commentaryTimeRange.start);

    // Add the commentary track.
    AVMutableCompositionTrack *compositionCommentaryTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, commentaryTimeRange.duration) ofTrack:[[self.commentary tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:commentaryTimeRange.start error:nil];

    // Duck (lower) the volume of the other audio tracks while the commentary plays.
    NSMutableArray *trackMixArray = [NSMutableArray array];
    CMTime rampDuration = CMTimeMake(1, 2); // half-second ramps
    for (i = 0; i < [tracksToDuck count]; i++) {
        AVMutableAudioMixInputParameters *trackMix = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:[tracksToDuck objectAtIndex:i]];
        [trackMix setVolumeRampFromStartVolume:1.0 toEndVolume:0.2 timeRange:CMTimeRangeMake(CMTimeSubtract(commentaryTimeRange.start, rampDuration), rampDuration)];
        [trackMix setVolumeRampFromStartVolume:0.2 toEndVolume:1.0 timeRange:CMTimeRangeMake(CMTimeRangeGetEnd(commentaryTimeRange), rampDuration)];
        [trackMixArray addObject:trackMix];
    }
    audioMix.inputParameters = trackMixArray;

}

Thanks Daniel, I figured it out. It's quite simple:

AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:audioUrl options:nil];
AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:videoUrl options:nil];

AVMutableComposition* mixComposition = [AVMutableComposition composition];

AVMutableCompositionTrack *compositionCommentaryTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio 
                                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration) 
                                    ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] 
                                     atTime:kCMTimeZero error:nil];

AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo 
                                                                                    preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) 
                               ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] 
                                atTime:kCMTimeZero error:nil];

AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition 
                                                                      presetName:AVAssetExportPresetPassthrough];   

NSString* videoName = @"export.mov";

NSString *exportPath = [NSTemporaryDirectory() stringByAppendingPathComponent:videoName];
NSURL    *exportUrl = [NSURL fileURLWithPath:exportPath];

if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) 
{
    [[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
}

_assetExport.outputFileType = @"com.apple.quicktime-movie";
DLog(@"file type %@",_assetExport.outputFileType);
_assetExport.outputURL = exportUrl;
_assetExport.shouldOptimizeForNetworkUse = YES;

[_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
    // your completion code here
}];
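In the completion handler it's worth checking the session's status before using the exported file, since export can fail or be cancelled. A minimal sketch, assuming the `_assetExport` and `exportUrl` variables from the snippet above:

```objectivec
[_assetExport exportAsynchronouslyWithCompletionHandler:^{
    dispatch_async(dispatch_get_main_queue(), ^{
        switch (_assetExport.status) {
            case AVAssetExportSessionStatusCompleted:
                NSLog(@"Export finished: %@", exportUrl);
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Export failed: %@", _assetExport.error);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Export cancelled");
                break;
            default:
                break;
        }
    });
}];
```

The completion handler is not guaranteed to run on the main thread, hence the `dispatch_async` before touching anything UI-related.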

Here is the Swift version:

func mixAudio(audioURL audioURL: NSURL, videoURL: NSURL) {
    let audioAsset = AVURLAsset(URL: audioURL)
    let videoAsset = AVURLAsset(URL: videoURL)

    let mixComposition = AVMutableComposition()

    // add audio
    let compositionCommentaryTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeAudio, preferredTrackID: kCMPersistentTrackID_Invalid)
    let timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration)
    let track = audioAsset.tracksWithMediaType(AVMediaTypeAudio)[0]
    do {
        try compositionCommentaryTrack.insertTimeRange(timeRange, ofTrack: track, atTime: kCMTimeZero)
    }
    catch {
        print("Error insertTimeRange for audio track \(error)")
    }

    // add video
    let compositionVideoTrack = mixComposition.addMutableTrackWithMediaType(AVMediaTypeVideo, preferredTrackID: kCMPersistentTrackID_Invalid)
    let timeRangeVideo = CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
    let trackVideo = videoAsset.tracksWithMediaType(AVMediaTypeVideo)[0]
    do {
        try compositionVideoTrack.insertTimeRange(timeRangeVideo, ofTrack: trackVideo, atTime: kCMTimeZero)
    }
    catch {
        print("Error insertTimeRange for video track \(error)")
    }

    // export
    let assetExportSession = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetPassthrough)
    let videoName = "export.mov"
    // `exportPath` is a String? property on self, so the completion handler can read it later.
    exportPath = "\(NSTemporaryDirectory())/\(videoName)"
    let exportURL = NSURL(fileURLWithPath: exportPath!)

    if NSFileManager.defaultManager().fileExistsAtPath(exportPath!) {
        do {
            try NSFileManager.defaultManager().removeItemAtPath(exportPath!)
        }
        catch {
            print("Error deleting export.mov: \(error)")
        }
    }

    assetExportSession?.outputFileType = "com.apple.quicktime-movie"
    assetExportSession?.outputURL = exportURL
    assetExportSession?.shouldOptimizeForNetworkUse = true
    assetExportSession?.exportAsynchronouslyWithCompletionHandler({
        // Check the session status before assuming success.
        dispatch_async(dispatch_get_main_queue(), {
            if assetExportSession?.status == .Completed {
                print("Mixed audio and video: \(self.exportPath!)")
            } else {
                print("Export failed: \(assetExportSession?.error)")
            }
        })
    })
}

Comments:

- Where is this demo? I can't find a download for it anywhere.
- I'm merging just one audio file with one video file. The problem is that my video is 40 seconds and the audio is 28 seconds, so for the remaining 12 (40 - 28) seconds I want the audio to repeat from 0. How can I do that? Is there a direct way?
- Actually, this code doesn't work correctly. When I implemented it in my project, it crashed at [compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration) ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil]; with "no object at index 0".
- @Swastik I've hit the same problem on occasion, especially with files coming from iCloud. My fix was to do two things: 1) verify that the file is actually valid for the media type I'm trying to use, and 2) make sure the file actually contains data (iCloud files sometimes don't).
- Hi Steve, I'm merging just one audio/video pair: my video is 40 seconds and the audio is 28, and I want the audio to repeat from 0 for the remaining 12 seconds. Is there a direct way to do that?
- Same problem... crash at [[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0], no object at index 0. Has anyone found a solution? @jayeshlathiya did you find a solution?
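Two of the issues raised in the comments can be addressed with small additions to the Objective-C answer. This is a hedged sketch, not a tested drop-in (it assumes the `audioAsset`, `videoAsset`, and `compositionCommentaryTrack` variables from the answer above): first guard against assets that report no audio tracks (the "no object at index 0" crash, common with not-yet-downloaded iCloud files), then loop a shorter audio clip by inserting it repeatedly until the video duration is covered.

```objectivec
// Guard: some assets (e.g. iCloud files that haven't downloaded) have no audio tracks.
NSArray *audioTracks = [audioAsset tracksWithMediaType:AVMediaTypeAudio];
if (audioTracks.count == 0) {
    NSLog(@"Asset has no audio track; aborting merge.");
    return;
}
AVAssetTrack *audioTrack = [audioTracks objectAtIndex:0];

// Loop the (shorter) audio until the video duration is covered,
// e.g. a 28 s clip under a 40 s video becomes one 28 s insert plus one 12 s insert.
CMTime cursor = kCMTimeZero;
CMTime videoDuration = videoAsset.duration;
while (CMTIME_COMPARE_INLINE(cursor, <, videoDuration)) {
    CMTime remaining = CMTimeSubtract(videoDuration, cursor);
    CMTime chunk = CMTIME_COMPARE_INLINE(audioAsset.duration, <, remaining) ? audioAsset.duration : remaining;
    NSError *error = nil;
    [compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, chunk)
                                        ofTrack:audioTrack
                                         atTime:cursor
                                          error:&error];
    if (error) {
        NSLog(@"insertTimeRange failed: %@", error);
        break;
    }
    cursor = CMTimeAdd(cursor, chunk);
}
```

Each pass inserts the audio track starting at time zero of the source, clipped to whatever time remains under the video, so the last insert is the partial repeat the commenters asked about.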