iOS AVMutableCompositionTrack setting volume not working
When mixing a recorded video with an audio track from the app bundle, I can't set the audio volume. Here is my code:
AVMutableComposition *composition = [[AVMutableComposition alloc] init];
NSString *resourcePath = [[NSBundle mainBundle] pathForResource:@"give-it-away" ofType:@"mp3"];
AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:resourcePath] options:[NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES],AVURLAssetPreferPreciseDurationAndTimingKey, nil]];
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
// videoAsset, audioTime and videoError are declared elsewhere in the original snippet
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[videoAsset.tracks objectAtIndex:0] atTime:kCMTimeZero error:nil];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioTime) ofTrack:[audioAsset.tracks objectAtIndex:0] atTime:kCMTimeZero error:&videoError];
AVMutableAudioMixInputParameters *audioInputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:audioTrack] ;
[audioInputParams setVolume:0.3 atTime:kCMTimeZero];
[audioInputParams setTrackID:audioTrack.trackID];
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = [NSArray arrayWithObject:audioInputParams];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetPassthrough];
exportSession.outputURL = [NSURL fileURLWithPath:finalVideoWithAudioPath];
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
exportSession.audioMix = audioMix;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
switch (exportSession.status)
{
case AVAssetExportSessionStatusFailed:
{
[self performSelectorOnMainThread:@selector(doPostExportFailed) withObject:nil waitUntilDone:NO];
break;
}
case AVAssetExportSessionStatusCompleted:
{
[self performSelectorOnMainThread:@selector(doPostExportSuccess) withObject:nil waitUntilDone:YES];
break;
}
}
}];
The export completes successfully, but the audio volume does not change.
What am I doing wrong?
Thanks.

Update (December 2018)

This code works for me on iOS 12.1.2:
- (void) combineAudio:(NSString*)audioPath forRecord:(VideoRecord*)record isResource:(BOOL)isResource isSilent:(BOOL)isSilent keepCurrentAudio:(BOOL)keepCurrentAudio withCompletionHandler:(void (^)(AVAssetExportSession* exportSession, NSString* exportPath))handler {
NSString *resourcePath = audioPath;
if (isResource) {
resourcePath = [[NSBundle mainBundle] pathForResource:resourcePath ofType:@"mp3"];
}
NSURL *url = [NSURL fileURLWithPath:resourcePath];
AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:url options:nil];
AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:[NSURL URLWithString:[NSString stringWithFormat:@"file://%@",record.videoPath]] options:nil];
AVMutableComposition *composition = [self getComposition];
AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
AVMutableCompositionTrack *compositionAudioOriginalTrack = nil;
if (keepCurrentAudio && [videoAsset tracksWithMediaType:AVMediaTypeAudio].count > 0) {
compositionAudioOriginalTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
preferredTrackID:kCMPersistentTrackID_Invalid];
[compositionAudioOriginalTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
}
if (isResource) {
CMTime videoDuration = videoAsset.duration;
if(CMTimeCompare(videoDuration, audioAsset.duration) == -1){
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
atTime:kCMTimeZero error:nil];
} else if(CMTimeCompare(videoDuration, audioAsset.duration) == 1) {
CMTime currentTime = kCMTimeZero;
while(YES){
CMTime audioDuration = audioAsset.duration;
CMTime totalDuration = CMTimeAdd(currentTime,audioDuration);
if(CMTimeCompare(totalDuration, videoDuration)==1){
// Trim the final segment to the remaining video time
audioDuration = CMTimeSubtract(videoDuration, currentTime);
}
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioDuration)
ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
atTime:currentTime error:nil];
currentTime = CMTimeAdd(currentTime, audioDuration);
if(CMTimeCompare(currentTime, videoDuration) == 1 || CMTimeCompare(currentTime, videoDuration) == 0){
break;
}
}
}
} else {
NSArray<AVAssetTrack *>* aTracks = [audioAsset tracksWithMediaType:AVMediaTypeAudio];
if (aTracks.count > 0) {
[compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
ofTrack:[aTracks objectAtIndex:0]
atTime:kCMTimeZero error:nil];
}
}
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
NSArray *tracks = [videoAsset tracksWithMediaType:AVMediaTypeVideo];
if (tracks.count == 0) {
CLSNSLog(@"%@ - combineAudio - video tracks zero", NSStringFromClass([self class]));
// TODO - Handle this error.
return;
}
[compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
ofTrack:[tracks objectAtIndex:0]
atTime:kCMTimeZero error:nil];
AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:[composition copy]
presetName:AVAssetExportPresetHighestQuality];
NSString *exportPath = [record.videoPath stringByReplacingOccurrencesOfString:@".mp4" withString:@"_audio_added.mp4"];
if ([record.videoPath containsString:@".MOV"]) {
exportPath = [record.videoPath stringByReplacingOccurrencesOfString:@".MOV" withString:@"_audio_added.mp4"];
}
if ([record.videoPath containsString:@".mov"]) {
exportPath = [record.videoPath stringByReplacingOccurrencesOfString:@".mov" withString:@"_audio_added.mp4"];
}
NSURL *exportUrl = [NSURL fileURLWithPath:exportPath];
if ([[NSFileManager defaultManager] fileExistsAtPath:exportPath]) {
[[NSFileManager defaultManager] removeItemAtPath:exportPath error:nil];
}
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
float volume = .5f;
if (keepCurrentAudio) {
volume = .6f;
}
AVMutableAudioMixInputParameters *audioInputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:compositionAudioTrack] ;
[audioInputParams setVolumeRampFromStartVolume:(isSilent ? .0f : volume) toEndVolume:(isSilent ? .0f : volume) timeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)];
[audioInputParams setTrackID:compositionAudioTrack.trackID];
NSArray *inputParams = [NSArray arrayWithObject:audioInputParams];
AVMutableAudioMixInputParameters *audioOriginalInputParams = nil;
if (keepCurrentAudio) {
audioOriginalInputParams = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:compositionAudioOriginalTrack];
// Configure the ramp and trackID on the parameters for the original audio track
[audioOriginalInputParams setVolumeRampFromStartVolume:(isSilent ? .0f : .06f) toEndVolume:(isSilent ? .0f : .06f) timeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)];
[audioOriginalInputParams setTrackID:compositionAudioOriginalTrack.trackID];
inputParams = [NSArray arrayWithObjects:audioInputParams, audioOriginalInputParams, nil];
}
audioMix.inputParameters = inputParams;
_assetExport.outputFileType = AVFileTypeQuickTimeMovie;
_assetExport.audioMix = audioMix;
_assetExport.outputURL = exportUrl;
_assetExport.shouldOptimizeForNetworkUse = YES;
[_assetExport exportAsynchronouslyWithCompletionHandler:^{
handler(_assetExport, exportPath);
}];
}
The VideoRecord class holds all the data my videos need, but in this example it only uses the video path, so you can replace it with a plain NSString.
Hope this helps.
Comments:
Does anyone know? Thanks. Were you able to solve this?
Thanks @Fabio for the code, but I couldn't get it to work on iOS 10.3. Is anyone else having the same problem?
Hi, this doesn't work for me. Do you have another solution?
Hi @Khush, I just updated my answer; the new code works for me on iOS 12.1.2.
@Khush, @werner kratochwil, I found that exporting with AVAssetExportPresetPassthrough does not work: a passthrough export copies the media samples without re-encoding, so the audioMix is never applied. Use a re-encoding preset such as AVAssetExportPresetHighestQuality instead.