iOS 7: AVPlayerItem to AVAsset

I have two AVAssets, and I apply changes to them with a video composition and an audio mix on an AVPlayerItem. Afterwards I take the asset from the AVPlayerItem, but the video composition and audio mix are not applied to it. I want the resulting asset to have the video composition and audio mix applied. Here is the code:

+ (AVAsset *)InitAsset:(AVAsset *)asset AtTime:(double)start ToTime:(double)end {
CGFloat colorComponents[4] = {1.0,1.0,1.0,0.0};

//Create an AVMutableComposition object. This object will hold our multiple AVMutableCompositionTrack objects.
AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init];

//Here we are creating the first AVMutableCompositionTrack. See how we are adding a new track to our AVMutableComposition.
AVMutableCompositionTrack *masterTrack =
[mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                            preferredTrackID:kCMPersistentTrackID_Invalid];
//Now we set the time range of the track to the selected portion of the asset and add the asset to our newly created track at kCMTimeZero, so the video plays from the start of the track.
[masterTrack insertTimeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(start, 1), CMTimeMakeWithSeconds(end, 1))
                     ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                      atTime:kCMTimeZero error:nil];

// Each video layer instruction
AVMutableVideoCompositionLayerInstruction *masterLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:masterTrack];
[masterLayerInstruction setOpacity:1.0f atTime:kCMTimeZero];
[masterLayerInstruction setOpacityRampFromStartOpacity:1.0f
                                          toEndOpacity:0.0
                                             timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(end, 1), CMTimeMakeWithSeconds(end + ANIMATION_FADE_TIME, 1))];

//See how we are creating the AVMutableVideoCompositionInstruction object. This object will contain the array of our AVMutableVideoCompositionLayerInstruction objects. You set the duration of the instruction; it should be as long as the duration of the longer asset.
AVMutableVideoCompositionInstruction * MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
[MainInstruction setTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(end + ANIMATION_FADE_TIME, 1))];
[MainInstruction setLayerInstructions:[NSArray arrayWithObjects:masterLayerInstruction,nil]];
[MainInstruction setBackgroundColor:CGColorCreate(CGColorSpaceCreateDeviceRGB(), colorComponents)];

//Now we create the AVMutableVideoComposition object. We can add multiple AVMutableVideoCompositionInstruction objects to it; we have only one in our example. You can use multiple AVMutableVideoCompositionInstruction objects to add multiple layers of effects such as fades and transitions, but make sure the time ranges of the AVMutableVideoCompositionInstruction objects don't overlap.
AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
MainCompositionInst.frameDuration = CMTimeMake(1, 30);
MainCompositionInst.renderSize = CGSizeMake(1280, 720);
//    [MainCompositionInst setFra]

AVMutableCompositionTrack *masterAudio = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[masterAudio insertTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(end + ANIMATION_FADE_TIME, 1))
                     ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];

// Each Audio
AVMutableAudioMixInputParameters *masterAudioMix = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:masterAudio];
[masterAudioMix setVolume:1.0f atTime:kCMTimeZero];
[masterAudioMix setVolumeRampFromStartVolume:1.0f
                                 toEndVolume:0.0f
                                   timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(end, 1), CMTimeMakeWithSeconds(end + ANIMATION_FADE_TIME, 1))];
//    [SecondTrackMix setVolume:1.0f atTime:CMTimeMake(2.01, 1)];

AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = [NSArray arrayWithObjects:masterAudioMix,nil];

//Finally just add the newly created AVMutableComposition with multiple tracks to an AVPlayerItem and play it using AVPlayer.
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:mixComposition];
item.videoComposition = MainCompositionInst;
item.audioMix = audioMix;

return [item asset];
}
Does anyone know how to do this?


Best regards.

Use AVAssetExportSession.

An AVAssetExportSession object transcodes the contents of an AVAsset source object to create an output of the form described by a specified export preset.

Use AVAssetExportSession's audioMix and videoComposition properties:

audioMix: indicates whether non-default audio mixing is enabled for export and supplies the parameters for the audio mix.

@property (nonatomic, copy) AVAudioMix *audioMix;

videoComposition: indicates whether video composition is enabled for export and, if so, provides the instructions for the video composition.


@property (nonatomic, copy) AVVideoComposition *videoComposition;
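
For example, here is a minimal sketch of that approach, reusing the mixComposition, MainCompositionInst, and audioMix variables from the question's method. An AVPlayerItem's videoComposition and audioMix only affect playback, so rendering them into a new file is what actually bakes them into an asset; the preset, output file name, and error handling below are illustrative assumptions, not part of the original answer.

#import <AVFoundation/AVFoundation.h>

// Sketch: render mixComposition to a file with the video composition and audio mix applied.
// outputURL is a hypothetical destination chosen only for illustration.
NSURL *outputURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"flattened.mov"]];
[[NSFileManager defaultManager] removeItemAtURL:outputURL error:nil];

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                       presetName:AVAssetExportPreset1280x720];
exportSession.outputURL = outputURL;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.videoComposition = MainCompositionInst; // the fade-out instructions
exportSession.audioMix = audioMix;                    // the volume ramp

[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status != AVAssetExportSessionStatusCompleted) {
        NSLog(@"Export failed: %@", exportSession.error);
    }
}];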

I don't understand the question. Do you want to change something, or did you change something and it is not being applied? Could you edit your question to show some code of what you are trying to do?

Thanks for your reply; I have edited it. Can I export them to an AVAsset instead of to a file?

Not with AVAssetExportSession. Why not just use AVAsset's method + (id)assetWithURL:(NSURL *)URL?

Because I need to call this method repeatedly, and exporting to a file and reopening it every time takes a lot of time.
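
To tie this back to the answer: a brief sketch of the reload step discussed in these comments, assuming it runs inside the export session's completion handler above after a successful export (variable names follow the earlier sketch).

// Once the export has completed, reopen the rendered file as a plain AVAsset.
AVAsset *renderedAsset = [AVAsset assetWithURL:outputURL];
AVPlayerItem *renderedItem = [AVPlayerItem playerItemWithAsset:renderedAsset];
// The fade and volume ramp are now baked into the file, so no videoComposition
// or audioMix needs to be set on this player item.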