iOS: Producing an AVFileTypeMPEG4 video file with AVAssetExportSession and AVMutableComposition


I am using the "OWVideoProcessor" library to cut out parts of a live video recording. The resulting video plays fine on any Apple device, but when I play it in a browser (via Dropbox), several extra seconds are prepended to it, and those prepended seconds have no audio. You can see a sample of these videos here. If you download the video on an Apple device, it is 20 seconds long; if you play it in the browser, it is 29 seconds.

Here is the code used to stitch the video together:

 - (void)stitchVideoWithDestinationPath:(NSString *)destinationPath completion:(void(^)(NSError *error))completion {  
    [self.exportSession cancelExport];  

    NSLog(@"export started to path: %@", destinationPath);  

    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];  
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];  
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];  

    CMTime startTime = kCMTimeZero;  

    int lastIndex = self.segmentStart + self.segmentCount - 1;  
    NSLog(@"Stitching segments in interval: [%d - %d]", self.segmentStart, lastIndex);  

    for (int i = self.segmentCount - 5; i < lastIndex; i++) {  
        CMTimeShow(startTime);  
        NSURL *url = [OWUtilities urlForRecordingSegmentCount:i basePath:self.basePath];  
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:@{AVURLAssetPreferPreciseDurationAndTimingKey: @(YES)}];  
        NSAssert(asset, @"Invalid asset at: %@", url);  

        BOOL hasAllTracks = [[asset tracks] count] >= 2;  
        if (hasAllTracks) {  
            CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, asset.duration);  

            AVAssetTrack *track = nil;  
            track = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];  
            [videoTrack insertTimeRange:timeRange ofTrack:track atTime:startTime error:nil];  

            track = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];  
            [audioTrack insertTimeRange:timeRange ofTrack:track atTime:startTime error:nil];  

            startTime = CMTimeAdd(startTime, asset.duration);  
        }  
    }  
    NSTimeInterval segmentsDuration = CMTimeGetSeconds(startTime);  
    NSLog(@"Total segments duration: %.2f", segmentsDuration);  

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetPassthrough];  

    if (![[NSFileManager defaultManager] fileExistsAtPath:destinationPath]) {
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        documentsDirectory = [documentsDirectory stringByAppendingString:@"/uploads/"];
        documentsDirectory = [documentsDirectory stringByAppendingString:[destinationPath lastPathComponent]];

        if ([[NSFileManager defaultManager] fileExistsAtPath:documentsDirectory]) {
            destinationPath = documentsDirectory;
        }
    }
    exporter.outputURL = [NSURL fileURLWithPath:destinationPath];  
    exporter.outputFileType = AVFileTypeMPEG4;  
    BOOL trimRange = (segmentsDuration > self.outputSegmentDuration);  
    if (trimRange) {  
        CMTime duration = CMTimeMakeWithSeconds(self.outputSegmentDuration, startTime.timescale);  
        NSTimeInterval startInterval = segmentsDuration - self.outputSegmentDuration;  
        CMTime start = CMTimeMakeWithSeconds(startInterval, startTime.timescale);  
        exporter.timeRange = CMTimeRangeMake(start, duration);  

        NSLog(@"Exporting segment:");  
        CMTimeRangeShow(exporter.timeRange);  
        NSTimeInterval segmentsDuration2 = CMTimeGetSeconds(duration);  
        NSLog(@"Total segments duration: %.2f", segmentsDuration2);  
    }  


    @weakify(self, exporter);  
    [exporter exportAsynchronouslyWithCompletionHandler:^{  
        @strongify(self, exporter);  
        NSLog(@"error: %@", exporter.error);  
        // Only invoke the completion block if one was supplied; report the
        // export error unless the export was cancelled.
        if (completion) {
            completion(exporter.status == AVAssetExportSessionStatusCancelled ? nil : exporter.error);
        }


        if (self.exportSession == exporter) {  
            self.exportSession = nil;  
        }  
    }];  

    self.exportSession = exporter;  
}
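
As a sanity check for the duration mismatch, the exported file's duration can be read back after the export finishes. This is a minimal sketch, assuming `destinationPath` is the same path passed to the method above:

```objc
// Sketch: read the container duration of the exported file.
// AVURLAssetPreferPreciseDurationAndTimingKey forces AVFoundation to
// parse the media data rather than trust the container header, so this
// reflects what a strict player would see.
AVURLAsset *exported = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:destinationPath]
                                           options:@{AVURLAssetPreferPreciseDurationAndTimingKey: @YES}];
NSLog(@"Exported duration: %.2f s", CMTimeGetSeconds(exported.duration));
```

If this logs the expected 20 seconds while the browser still plays 29, the discrepancy lies in how the player interprets the stream, not in the composition's time range.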

The problem is not in the code above. The problem is here:

NSDictionary *videoCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                              AVVideoCodecH264, AVVideoCodecKey,
                                              [NSNumber numberWithInteger:width], AVVideoWidthKey,
                                              [NSNumber numberWithInteger:height], AVVideoHeightKey,
                                              [NSDictionary dictionaryWithObjectsAndKeys:
                                               [NSNumber numberWithInteger: bps ], AVVideoAverageBitRateKey,
                                               [NSNumber numberWithInteger:300], AVVideoMaxKeyFrameIntervalKey,
                                               nil], AVVideoCompressionPropertiesKey,
                                              nil]; 
This code sets the video compression settings. AVVideoAverageBitRateKey was set too low (around 600 kbit/s), while AVVideoMaxKeyFrameIntervalKey was set far too large. So I changed AVVideoMaxKeyFrameIntervalKey to 1 and increased AVVideoAverageBitRateKey to 5000 kbit/s. That solved my problem.
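
For reference, the adjusted dictionary looks roughly like this (a sketch of my change; the 5000 kbit/s figure and the `width`/`height` variables come from my setup and should be tuned for your own recordings):

```objc
// Adjusted compression settings: with a keyframe interval of 1, every
// frame is a keyframe, so a cut point never depends on a keyframe that
// was trimmed away -- at the cost of a larger file, hence the higher
// average bitrate.
NSDictionary *videoCompressionSettings = @{
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: @(width),
    AVVideoHeightKey: @(height),
    AVVideoCompressionPropertiesKey: @{
        AVVideoAverageBitRateKey: @(5000 * 1000),  // 5000 kbit/s
        AVVideoMaxKeyFrameIntervalKey: @1,
    },
};
```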


This code was written to reduce the video size. You can change it in the OWVideoProcessor library.
